Techcospace | Jobs | AWS Data Engineer – Python/PySpark – Remote | BigDataKB.com | 18-02-22

    Job Location: Delhi / NCR

    Primary Skill Set:

    • Python, PySpark, S3, AWS Glue, Lambda, EC2, SQL, Apache NiFi

    Job Description:

    • Experience performing design, hands-on development, and deployment using AWS services (S3, AWS Glue, Lambda, Python, EC2, and SQL).
    • Experience in optimal extraction, transformation, and loading of data from a wide variety of data sources using AWS Big Data technologies.
    • Preferably experienced with data warehouses or data lakes (Redshift, BigQuery, Snowflake).

    • Develop and maintain scalable data pipelines and build out new data source integrations to support continuing increases in data volume and complexity.
    • Experience building and optimizing big data pipelines and data sets.
    • Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams.
    • Collaborate with BI, analytics, and business teams.
    • Good to have experience with NoSQL databases like MongoDB.
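    To give candidates a concrete sense of the extract-transform-load work described above, here is a minimal sketch of an ETL pipeline. It is an illustration only, with hypothetical sample data: the role itself would use PySpark and AWS Glue against S3 sources, but this sketch uses pure Python with sqlite3 standing in for a warehouse like Redshift or Snowflake so it can run anywhere.

    ```python
    import csv
    import io
    import sqlite3

    # Hypothetical sample data standing in for a file on S3;
    # in the actual role, extraction would target S3 via Glue/PySpark.
    RAW_CSV = """order_id,region,amount
    1,north,120.50
    2,south,80.00
    3,north,45.25
    """

    def extract(raw):
        """Extract: parse the raw CSV into a list of record dicts."""
        return list(csv.DictReader(io.StringIO(raw)))

    def transform(rows):
        """Transform: cast types and keep only orders of 50.0 or more."""
        out = []
        for row in rows:
            amount = float(row["amount"])
            if amount >= 50.0:
                out.append((int(row["order_id"]), row["region"], amount))
        return out

    def load(rows):
        """Load: write into a SQL store (sqlite3 stands in for the warehouse)."""
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
        return conn

    conn = load(transform(extract(RAW_CSV)))
    total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    ```

    The same extract/transform/load structure carries over to PySpark, where `extract` becomes a `spark.read` call, `transform` becomes DataFrame operations, and `load` writes to the warehouse.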

    Educational Qualification Preferred:

    • B.E./B.Tech. or a degree in a related field

    Experience Preferred:

    • 2+ years of work experience

    Apply Here
