DTDC | Jobs | Data Engineer | BigDataKB.com | 07-02-22


    Job Location: Bangalore/Bengaluru

    Roles and Responsibilities

    • Create and manage optimal data pipeline architecture by designing and implementing data ingestion solutions on AWS/GCP, using AWS/GCP native services or data management technologies such as Talend, PDI, or Python.
    • Integrate and assemble large, complex data sets that meet a broad range of business requirements, using AWS/GCP services such as Glue/Dataflow.
    • Design and optimize data models on AWS Redshift and Google BigQuery.
    • Perform data discovery aligned to business needs and maintain data quality.
    • Create and maintain data structures in the data warehouse.
    • Create data presentation layers from the data warehouse using BI tools.
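The ingestion and data-quality duties above can be sketched in plain Python. This is a minimal illustration only; the record fields (`order_id`, `amount`) and the validation rules are hypothetical, not part of the role description:

```python
import json


def ingest_records(raw_json: str) -> list[dict]:
    """Parse a JSON batch and keep only rows that pass basic quality checks.

    A toy stand-in for the validation step of an ingestion pipeline that
    would feed a warehouse such as Redshift or BigQuery.
    """
    records = json.loads(raw_json)
    clean = []
    for rec in records:
        # Data-quality gate: require a non-empty id and a numeric amount.
        if rec.get("order_id") and isinstance(rec.get("amount"), (int, float)):
            clean.append({"order_id": rec["order_id"], "amount": float(rec["amount"])})
    return clean


# Example batch: the second record fails validation and is dropped.
batch = '[{"order_id": "A1", "amount": 10}, {"order_id": null, "amount": 5}]'
rows = ingest_records(batch)
```

In a real pipeline this gate would sit between the raw landing zone (e.g. S3) and the warehouse load, with rejected rows routed to a quarantine table for review.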

    Desired Candidate Profile

    • Bachelor of Engineering or equivalent in Computer Science/Information Technology
    • 5 years of overall experience
    • At least 3 years of experience with a data engineering emphasis
    • Proficient in AWS services – S3, Redshift, Glue, Lambda, and IAM roles
    • Proficient in GCP data services
    • Strong experience with SQL on relational databases such as Postgres and MariaDB
    • Exposure to large databases, data quality, and performance tuning
    • Experience developing with Python
    • Good practical understanding of file formats, including JSON, Parquet, and others
    • Knowledge of the visualization tool Tableau is an added advantage

    Apply Here


