Randstad | Hiring | data architect | Chennai | BigDataKB.com | 6 Oct 2022

Job Location: Chennai

summary


  • chennai, tamil nadu

  • a client of randstad india

  • permanent
  • reference number

    JPC – 74399

job details
Company: a leading shipping technology company
Location: Chennai
Experience: 5-8 years and 8-15 years

Responsibilities for DW Architect

  • Design and implement relational and dimensional models in Snowflake and related tools.
  • Design, build and support stable, scalable data pipelines or ETL processes that cleanse, structure and integrate big data sets from multiple data sources into the DW and provision them to transactional systems and Business Intelligence reporting (see the sketch after this list).
  • Collaborate with product and other engineering teams on normalizing and aggregating large data sets based on business needs and requirements.
  • Work on different databases (Snowflake, Postgres, MySQL, NoSQL)
  • Work with different data formats (JSON, CSV, Parquet, Avro) and interact with various on-premise and cloud data sources as well as RESTful APIs.
  • Establish, maintain and administer DW clusters and other data infrastructure
  • Implement systems for tracking data quality and consistency
  • Coach and mentor team members and promote team development and growth
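
The pipeline responsibilities above amount to a standard extract-cleanse-load step. The Python sketch below shows one minimal way such a step could look, assuming the snowflake-connector-python package; the shipping-themed table, column and environment-variable names are illustrative placeholders, not details from this posting.

    # Minimal ETL sketch: cleanse JSON shipment events and load them into a
    # Snowflake staging table. Requires snowflake-connector-python; all table,
    # column and credential names are illustrative placeholders.
    import json
    import os

    import snowflake.connector


    def cleanse(record: dict) -> tuple:
        # Normalize one raw event into the column order of the staging table.
        return (
            record.get("shipment_id"),
            (record.get("port") or "").strip().upper(),
            float(record.get("weight_kg") or 0.0),
            record.get("event_ts"),
        )


    def load(path: str) -> None:
        # Read newline-delimited JSON, cleanse each record, then bulk-insert.
        with open(path) as fh:
            rows = [cleanse(json.loads(line)) for line in fh if line.strip()]

        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="ETL_WH",
            database="SHIPPING",
            schema="STAGING",
        )
        try:
            conn.cursor().executemany(
                "INSERT INTO stg_shipment_events "
                "(shipment_id, port, weight_kg, event_ts) VALUES (%s, %s, %s, %s)",
                rows,
            )
        finally:
            conn.close()


    if __name__ == "__main__":
        load("shipment_events.jsonl")

For genuinely large data sets a bulk COPY INTO (or write_pandas) load would normally replace the row-wise INSERT; the row-wise form is kept here only for brevity.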

Qualifications for DW Architect

  • 5+ years of experience in a data warehousing or data engineering role
  • Must have 2+ years of experience with modern data warehouse systems
  • Experience designing data warehouse architecture and implementing it
  • Very strong SQL skills and experience
  • Knowledge and expertise in system performance and optimization, schema design, and capacity planning
  • Proven ability to design, develop, and maintain data warehousing and ETL workflows for large data sets and interact with various sources
  • Knowledge of industry-standard ETL and workflow tools (Airflow or other third-party tools); experience writing your own in Python is preferred (see the sketch after this list)
  • Working experience with Azure or Amazon Web Services (S3, EC2, Lambda, Data Pipeline)
  • Hands-on experience and expertise in using advanced Snowflake/MySQL/Postgres SQL features, specifically analytical (window) functions.
  • Strong programming and scripting skills (Python experience is a plus)
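
Two of the qualifications above, workflow tooling such as Airflow and SQL analytical functions, can be illustrated together. The sketch below is a minimal daily Airflow DAG that rebuilds a reporting table using a ROW_NUMBER() window function, assuming the apache-airflow-providers-snowflake package; the DAG id, connection id and table names are hypothetical.

    # Minimal Airflow sketch: a daily DAG that refreshes a reporting table with
    # an analytical (window) function. Requires apache-airflow and the
    # apache-airflow-providers-snowflake package; all names are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

    # Keep only the latest event per shipment when rebuilding the table.
    LATEST_EVENT_SQL = """
    CREATE OR REPLACE TABLE reporting.latest_shipment_status AS
    SELECT shipment_id, port, event_ts
    FROM (
        SELECT shipment_id, port, event_ts,
               ROW_NUMBER() OVER (
                   PARTITION BY shipment_id ORDER BY event_ts DESC
               ) AS rn
        FROM staging.stg_shipment_events
    ) AS ranked
    WHERE rn = 1
    """

    with DAG(
        dag_id="dw_latest_shipment_status",
        start_date=datetime(2022, 10, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        refresh = SnowflakeOperator(
            task_id="refresh_latest_shipment_status",
            snowflake_conn_id="snowflake_default",
            sql=LATEST_EVENT_SQL,
        )

The ROW_NUMBER() subquery itself is also valid on Postgres and MySQL 8+, while Snowflake additionally offers QUALIFY as a shorthand for this keep-latest-row pattern.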

  • experience

    17


  • skills

    • Data Architect

  • qualifications

    • B.E/B.Tech




Apply Here

