Tekstrom Inc | Data Engineer - Snowflake, Matillion | The Great Bharat | Bharat | BigDataKB.com | 2023-03-09



Job Location: The Great Bharat

Job Detail:

Hi All,


This is Praveen from Tekstrom Inc. One of our clients is looking for a Data Engineer (Snowflake/AWS/Python/Matillion) – 100% remote.


If you are interested, please share your resume with [email protected]

Title: Sr. Data Engineer (Snowflake/AWS/Python/Matillion)

Location: 100% Remote

Position type: 6+ Months Contract

Work Timings: 6:30 PM – 3:30 AM (night shift)

Extra hours are required on Wednesdays and Saturdays for deployments

Job Description


  • Deploy and maintain Snowflake warehouse to support ongoing business needs and dynamic data solutions
  • Cleanse and migrate data in support of the warehouse
  • Build Source-Target mappings for in-scope subject areas
  • Design and provision data replication processes between AWS, Azure, and Snowflake
  • Build Agile Project Backlog and establish Scrum cadence
  • Validate Snowflake and AWS environment readiness
  • Create Snowflake data-ingestion framework
  • Create and deploy Snowflake objects to accommodate the in-scope subject areas
  • Create all pertinent technical documentation for deliverable activities, including translating business requirements into technical requirements and producing post-deployment support documentation covering Snowflake databases, schemas, tables, views, sequences, user accounts, and roles
  • Iteratively enhance a metadata-driven ELT framework for Snowflake
  • Provide validation results and coordination with QA
  • Assemble large, complex data sets that meet business requirements
  • Identify, design, and implement a solution for automating manual processes and optimizing data delivery
  • Work with stakeholders including the Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs
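The "metadata-driven ELT framework" responsibility above can be sketched in miniature: a table of metadata drives generation of Snowflake `COPY INTO` statements, so adding a new subject area means adding a metadata entry rather than new code. This is a minimal illustration only — the table names, stage paths, and file-format names below are assumptions, not details from the posting.

```python
# Hypothetical sketch of a metadata-driven ingestion step: a small
# metadata table (illustrative names, not from the posting) drives
# generation of Snowflake COPY INTO statements.

TABLE_METADATA = [
    {"table": "RAW.ORDERS", "stage": "@ingest_stage/orders/", "file_format": "csv_fmt"},
    {"table": "RAW.CUSTOMERS", "stage": "@ingest_stage/customers/", "file_format": "json_fmt"},
]

def build_copy_statement(meta: dict) -> str:
    """Render a COPY INTO statement for one table's metadata entry."""
    return (
        f"COPY INTO {meta['table']} "
        f"FROM {meta['stage']} "
        f"FILE_FORMAT = (FORMAT_NAME = '{meta['file_format']}') "
        f"ON_ERROR = 'ABORT_STATEMENT';"
    )

if __name__ == "__main__":
    for meta in TABLE_METADATA:
        # In a real pipeline these statements would be executed through the
        # Snowflake Python connector; here we only print the generated SQL.
        print(build_copy_statement(meta))
```

Iteratively enhancing such a framework then amounts to enriching the metadata (e.g., adding merge keys or load schedules) while the generator code stays small.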


  • Minimum 6 years of experience designing and implementing fully operational, production-grade, large-scale data solutions on the Snowflake Data Warehouse
  • Must have experience integrating data between cloud and on-premises solutions
  • Solid experience in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse
  • Experience developing ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, and writing SQL queries against Snowflake
  • Experience developing and maintaining data warehouse objects
  • Experience working with Extract, Transform, Load (ETL) software
  • Experience working with large volumes of data (terabytes) and analyzing data structures
  • Excellent knowledge of the software development lifecycle (Agile), automated testing and associated methodologies and tools (e.g., Selenium), and a passion for quality processes
  • Experience with data builds (Snowflake, AWS), data integration (databases, ETL, Informatica, Snowflake, AWS), and data testing (Informatica, IDQ, Snowflake, AWS)
  • Schema design and dimensional data modeling
  • Custom ETL design, implementation and maintenance
  • Object-oriented programming languages
  • Analyzing data to identify deliverables, gaps and inconsistencies
  • Self-starter with the ability to multitask in a dynamic work environment
  • Experience in financial technologies and mortgage lending is preferred
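The Python/ETL requirements above usually translate into small cleansing transforms applied before rows reach the warehouse. The sketch below is one hedged example of such a step — the field names, date format, and target types are assumptions chosen for illustration, not part of the role's actual schema.

```python
# Hypothetical cleansing step for one record prior to loading into
# Snowflake: trim identifiers, normalize dates to ISO 8601, and coerce
# amounts to numeric types. Field names and formats are illustrative.
from datetime import datetime

def cleanse_record(raw: dict) -> dict:
    """Return a copy of `raw` with whitespace stripped, the date
    converted from MM/DD/YYYY to YYYY-MM-DD, and the amount as a float."""
    return {
        "customer_id": raw["customer_id"].strip(),
        "order_date": datetime.strptime(raw["order_date"], "%m/%d/%Y").date().isoformat(),
        "amount": float(raw["amount"]),
    }

if __name__ == "__main__":
    record = {"customer_id": "  C-1001 ", "order_date": "03/09/2023", "amount": "1499.99"}
    print(cleanse_record(record))
```

In practice a pipeline would apply such a function across millions of rows (or push the same logic into SnowSQL), but the shape of the transform — validate, normalize, coerce — is the same.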


  • Bachelor’s degree, preferably in Computer Science, Information Technology, Computer Engineering, or a related IT discipline; or equivalent experience

Apply Here


