Job Location: Bangalore/Bengaluru
What You Will Do
- Build a highly scalable and fault-tolerant data streaming platform to process and analyze large volumes of messages in real time
- Oversee containerization and deployment of microservices on Kubernetes
- Design and deploy the data engineering applications on Google Cloud Platform
- Manage auto-scaling and monitor infrastructure performance using InfluxDB and Grafana, or similar tools
- Collaborate with the team to ensure that the service level objectives and agreements are achieved
- Set up Infrastructure as Code using Terraform and Chef
What You Will Need
- At least 6 years of data engineering experience, including leading a team and building data engineering applications on large-scale distributed computing infrastructure in a cloud environment
- In-depth experience in containerization, including Docker and Kubernetes
- Experience working with large-scale production Kafka clusters
- Experience building real-time streaming applications using Kafka and Flink
- Experience working with Agile methodologies, Test-Driven Development, and implementing CI/CD pipelines using GitLab and Docker
- Good knowledge of automating and provisioning infrastructure deployments using Terraform and Chef/Ansible on a cloud platform like GCP
- Experience managing distributed systems like Elasticsearch/MongoDB and a serverless data warehouse like BigQuery or Redshift