Zensar Technologies | GCP – Data Engineer | Pune | India | BigDataKB.com | 17 Oct 2022


Job Location: Pune

GCP – Data Engineer (0092559_7)

Description


Mandatory Skills:


Candidates should have 5–8 years of experience as a GCP Data Engineer with BigQuery and Teradata, plus ELT design and development experience with Data Build Tool (DBT) and/or Informatica:

Job description for GCP Data Engineer

Roles and Responsibilities

    Having 4+ years of relevant experience in the design and development of large-scale data solutions using GCP services such as BigQuery, Dataproc, Dataflow, Cloud Bigtable, Cloud SQL, Pub/Sub, Cloud Data Fusion, Cloud Composer, Cloud Functions, and Cloud Storage

    Hands-on experience building data pipelines using Cloud Dataflow and Cloud Data Fusion
    Hands-on expertise in designing and developing data pipelines using Cloud Dataflow with the Apache Beam framework (Java/Python)
    Good knowledge of Apache Beam transforms
    Performing data validations, transformations, and integration on raw data using plugins in Data Fusion pipelines
    Expert in ETL/ELT processing and data cleansing
    Strong knowledge of relational databases
    Experience with Google Cloud data analytics services such as Dataproc, Dataflow, Pub/Sub, Data Fusion, Dataprep, Cloud Composer, and BigQuery
    Expert in migrating data from on-premises systems (Teradata, SAP HANA) to GCP
    Expertise in handling large datasets using the Python pandas module
    Build, maintain, monitor, and orchestrate workflows and data pipelines using Apache Airflow
    Proficiency in Apache Airflow development
    Experience with performance tuning of DAGs and task implementation
    Develop DAGs (data pipelines) for onboarding datasets and managing dataset changes
    Experience installing, configuring, and monitoring Apache Airflow clusters
    Experience developing guidelines for Airflow clusters and DAGs
    Good understanding of data warehousing concepts, architectural best practices, design patterns and performance tuning
    Experience building data processing pipelines to integrate large datasets from multiple sources and formats
    Experience designing and developing data pipelines from ingestion to consumption within a big data architecture using Python, Java, SQL, etc.
    Working experience in Python programming and knowledge of Linux environments
    Review code, SQL, internal database structures, and other programming constructs to understand how information is structured and how data moves and transforms within the data lake or between systems
    Good to have: experience analyzing GCP audit logs, along with monitoring and alerting
    Working knowledge of Cloud SDK CLI commands to automate workloads in GCP
    Strong analytical skills; able to guide the team and prepare a roadmap for the data migration plan into Google Cloud
    Experience building automation systems
    Strong knowledge of, or hands-on experience with, Teradata, Informatica, and SAP HANA tools and technologies, for quickly understanding existing scripts and logic
    Good knowledge of and hands-on experience with Data Build Tool (DBT) and Informatica for building ELT data pipelines

    Good knowledge of job scheduling tools such as Control-M
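
As a rough illustration of the validate/transform/load responsibilities listed above, here is a minimal, framework-free sketch in plain Python (no Dataflow or Data Fusion involved; the field names and rules are hypothetical examples, not part of the posting):

```python
# Minimal ELT-style sketch: extract -> validate (cleanse) -> transform -> load.
# All table/field names below are invented for illustration only.

import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def validate(rows: list[dict]) -> list[dict]:
    """Drop rows with a missing id or a non-numeric amount (data cleansing)."""
    clean = []
    for row in rows:
        if row.get("id") and row.get("amount", "").replace(".", "", 1).isdigit():
            clean.append(row)
    return clean

def transform(rows: list[dict]) -> list[dict]:
    """Cast types and normalize the region field."""
    return [
        {"id": int(r["id"]), "amount": float(r["amount"]), "region": r["region"].strip().upper()}
        for r in rows
    ]

def load(rows: list[dict], target: list) -> int:
    """Append transformed rows to an in-memory 'warehouse' table."""
    target.extend(rows)
    return len(rows)

raw = "id,amount,region\n1,10.5, emea \n,3.0,amer\n2,abc,apac\n3,7.25,apac\n"
warehouse: list = []
loaded = load(transform(validate(extract(raw))), warehouse)
print(loaded)  # 2 of the 4 rows survive validation
```

In a real Dataflow or Data Fusion pipeline each of these stages would map to a pipeline transform or plugin, but the validate-then-transform ordering is the same.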

  • SQL Knowledge:
    • Advanced SQL knowledge and hands-on experience writing complex queries with analytical functions
    • Strong knowledge of stored procedures
    • Troubleshooting, problem solving, and performance tuning of SQL queries accessing the data warehouse
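
The analytical (window) functions mentioned above can be sketched with Python's built-in sqlite3 module (SQLite 3.25+ supports window functions); the table and column names here are hypothetical:

```python
# Hedged sketch of two common analytical-function patterns:
# RANK() within a partition, and a partition-wide SUM() on every row.
# The 'orders' table is invented for illustration.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('EMEA', 100), ('EMEA', 300), ('APAC', 50), ('APAC', 200), ('APAC', 150);
""")

rows = conn.execute("""
    SELECT region,
           amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY region)                 AS region_total
    FROM orders
    ORDER BY region, rnk
""").fetchall()

for r in rows:
    print(r)
```

The same `OVER (PARTITION BY ...)` syntax carries over to BigQuery and Teradata, which is where these functions would typically be used in this role.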
  • Data Build Tool (DBT): good to have DBT knowledge
  • Good understanding of the DBT ecosystem
  • Hands-on experience with the DBT implementation approach; able to guide the team
  • Good knowledge of DBT project setup/configuration and building models, snapshots, seeds, and materializations
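
For context, a dbt model is a SQL file with Jinja configuration; the fragment below is a hypothetical minimal model (the source, table, and column names are invented for illustration) showing the model and materialization concepts listed above:

```sql
-- models/staging/stg_orders.sql -- hypothetical dbt model for illustration.
-- The config() block sets the materialization; source() resolves a raw table.
{{ config(materialized='table') }}

select
    order_id,
    customer_id,
    cast(order_amount as numeric) as order_amount
from {{ source('raw', 'orders') }}
where order_id is not null
```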


Nice to have skills:

  • Understanding of Cisco processes and data, such as IB data, service contract management, auto renewals, services domain, booking, quotes, digital case services, quote-to-cash, invoicing, licensing and provisioning, sales compensation, software and recurring revenue, etc.
  • GCP Data Engineer certifications
  • Experience with Snowflake, SAP HANA, Teradata, and Informatica tools is an added advantage
  • Experience/knowledge with reporting technologies (e.g., Tableau, Power BI) is an added advantage
  • Good to have: Hi-Tech Manufacturing domain experience

Primary Location
: India-Maharashtra-Pune

Job Posting
: Oct 14, 2022

Experience Required (In Years)
: Minimum 5 Maximum 8




Apply Here
