Job Location: Bangalore/Bengaluru
About Us
Headquartered in Dublin, Ohio, Cardinal Health, Inc. (NYSE: CAH) is a global, integrated healthcare services and products company connecting patients, providers, payers, pharmacists and manufacturers for integrated care coordination and better patient management. Backed by nearly 100 years of experience, with more than 50,000 employees in nearly 60 countries, Cardinal Health ranks among the top 20 on the Fortune 500.
Department Overview
augmented Intelligence (augIntel) builds automation, analytics and artificial intelligence solutions that drive success for Cardinal Health by creating material savings, efficiencies and revenue growth opportunities. The team drives business innovation by leveraging emerging technologies and turning them into differentiating business capabilities.
Job Overview
Design, build and operationalize large-scale enterprise data solutions and applications using one or more Google Cloud Platform data and analytics services in combination with technologies such as Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions and Airflow.
Responsibilities:
- Designing and implementing data transformation, ingestion and curation functions on GCP using GCP-native services or custom programming
- Designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, etc.
- Performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP
- Analyzing, re-architecting and re-platforming on-premises data warehouses to data platforms on GCP using GCP or third-party services
- Optimizing data pipelines for performance and cost in large-scale data lakes
Desired Qualifications:
- Hands-on GCP experience with a minimum of one solution designed and implemented at production scale
- 3+ years of experience writing complex SQL queries, stored procedures, etc.
- Hands-on experience architecting and designing data lakes on GCP that serve analytics and BI application integrations
- Experience designing and optimizing data models on GCP using data stores such as BigQuery and Bigtable
- Experience integrating GCP or third-party KMS/HSM with GCP data services to build secure data solutions
- Experience introducing and operationalizing self-service data preparation tools (e.g., Trifacta, Paxata) on GCP
- Experience architecting and implementing metadata management on GCP
- Experience architecting and implementing data governance and security for data platforms on GCP
- Agile development skills and experience
- Experience with CI/CD pipelines such as Concourse or Jenkins
- Experience with dimensional modeling using tools like AtScale
- Google Cloud Platform certification is a plus