Randstad | Hiring | Data Architect | Chennai | BigDataKB.com | 6 Oct 2022


Job Location: Chennai

summary


  • chennai, tamil nadu
  • a client of randstad india
  • permanent
  • reference number: JPC – 74399

job details
Company: a leading shipping technology company
Location: Chennai
Experience: 5-8 years and 8-15 years

Responsibilities for DW Architect

  • Design and implement relational and dimensional models in Snowflake and related tools (a star-schema sketch follows this list).
  • Design, build, and support stable, scalable data pipelines or ETL processes that cleanse, structure, and integrate big data sets from multiple data sources into the DW, and provision data to transactional systems and Business Intelligence reporting.
  • Collaborate with product and other engineering teams on normalizing and aggregating large data sets based on business needs and requirements.
  • Work on different databases (Snowflake, Postgres, MySQL, NoSQL).
  • Work with different data formats (JSON, CSV, Parquet, Avro) and interact with various on-premises and cloud data sources as well as RESTful APIs.
  • Establish, maintain, and administer DW clusters and other data infrastructure.
  • Implement systems for tracking data quality and consistency.
  • Coach and mentor team members and promote team development and growth.
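
The dimensional-modelling responsibility above generally means star-schema design. As a minimal sketch (all table and column names here are hypothetical, chosen only to suit a shipping company; the posting specifies no schema), a Snowflake fact table keyed to conformed dimensions could look like:

    -- Hypothetical star schema for shipment analytics (illustrative only).
    -- Dimension: one row per vessel, with a surrogate key.
    CREATE TABLE dim_vessel (
        vessel_key   INTEGER AUTOINCREMENT PRIMARY KEY,
        imo_number   VARCHAR NOT NULL,
        vessel_name  VARCHAR,
        vessel_type  VARCHAR
    );

    -- Dimension: standard calendar dimension, keyed as yyyymmdd.
    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,  -- e.g. 20221006
        full_date    DATE NOT NULL,
        year         SMALLINT,
        month        SMALLINT,
        day_of_week  SMALLINT
    );

    -- Fact: one row per shipment leg, referencing the dimensions.
    -- Note: Snowflake records PRIMARY KEY / REFERENCES constraints as
    -- metadata but does not enforce them (only NOT NULL is enforced).
    CREATE TABLE fact_shipment (
        shipment_id     VARCHAR NOT NULL,
        vessel_key      INTEGER REFERENCES dim_vessel (vessel_key),
        departure_key   INTEGER REFERENCES dim_date (date_key),
        arrival_key     INTEGER REFERENCES dim_date (date_key),
        cargo_weight_t  NUMBER(12, 3),     -- tonnes
        freight_usd     NUMBER(14, 2)
    );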

Qualifications for DW Architect

  • 5+ years of experience in a data warehousing or data engineering role
  • Must have 2+ years of experience with modern data warehouse systems
  • Experience designing data architectures and implementing them end to end
  • Very strong SQL skills and experience
  • Knowledge and expertise in system performance and optimization, schema design, and capacity planning
  • Proven ability to design, develop, and maintain data warehousing and ETL workflows for large data sets and interact with various sources
  • Knowledge of industry-standard ETL and workflow tools (Airflow or other third-party tools); experience writing your own in Python is preferred
  • Working experience with Azure or Amazon Web Services (S3, EC2, Lambda, Data Pipeline)
  • Hands-on experience and expertise with advanced Snowflake/MySQL/Postgres SQL features, specifically analytical (window) functions (see the example after this list)
  • Strong programming and scripting skills (Python experience is a plus)
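
The analytical-functions requirement refers to SQL window functions, which behave near-identically in Snowflake, Postgres, and MySQL 8+. A short illustrative query against the hypothetical fact_shipment table sketched earlier (again, the names are assumptions, not from the posting):

    -- Rank each vessel's shipments by freight revenue and keep a per-vessel
    -- running total; unlike GROUP BY, window functions keep one output row
    -- per input row.
    SELECT
        shipment_id,
        vessel_key,
        freight_usd,
        RANK() OVER (
            PARTITION BY vessel_key
            ORDER BY freight_usd DESC
        ) AS revenue_rank,
        SUM(freight_usd) OVER (
            PARTITION BY vessel_key
            ORDER BY departure_key
            ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW
        ) AS running_freight_usd
    FROM fact_shipment;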

  • experience: 17
  • skills
    • Data Architect
  • qualifications
    • B.E/B.Tech




Apply Here
