Job Location: Vadodara
If you are a smart & passionate team player – then this Data Engineer opportunity is for you!
We at IMRIEL are looking for a Data Engineer to implement methods to improve data
reliability and quality. You will combine the raw information from different sources to create
consistent and machine-readable formats. You will also develop and test architectures that
enable data extraction and transformation for predictive or prescriptive modeling.
Resourcefulness is a necessary skill in this role. If you truly love gaining new technical
knowledge and can add more awesomeness to the team, you are eligible!
What you’ll be doing:
- Work on the latest tools and technologies.
- Work on various data modelling methods.
- Work very closely with the team of Software Developers & Testers.
- Diagnose and resolve problems quickly.
- Design the right tool/framework for development & deployment.
- Observe existing patterns and recognize ways to change them and improve the
product & development methodologies.
- Get involved in the Data Migration & DevOps process and work closely with the
Software Developers.
- Automate as much as possible.
- Focus on the quality of the delivery and follow the best practices.
What you need:
Basic Skills:
- Bachelor's or Master's degree in Computer Science or equivalent; a professional
certification (e.g. Microsoft Azure Data Engineer, AWS Data Engineer, etc.) is a plus.
- Minimum 2 years' experience designing and developing solutions using AWS
services such as Lambda, Glue, SQS, SNS, Redshift, etc., or Azure Functions, Azure Data
Factory, Azure Databricks, Azure Data Lake, Azure Synapse, Azure SQL DB, Cosmos DB,
MS-SQL, etc.
- Research new technologies and data modelling methods
- Knowledge of Azure Data Analytics, Big Data, Data Warehousing, ETL, MDM
Projects/Products
- Hands-on experience in implementation of Data Warehouse projects
- Ability to create conceptual, logical, and physical data models for both relational and
dimensional solutions
- Ability to effectively manage large sets of data from a performance, security, and data
governance viewpoint
- Deep knowledge of a wide variety of database systems and data mining techniques
Email: careers@imriel.com www.imriel.com
- Experience building and testing physical data models for a variety of use cases,
meeting performance and efficiency goals
- Good knowledge of relational databases such as Oracle, PostgreSQL, and MySQL
- Good knowledge of unstructured data systems such as MongoDB and Cassandra
- Knowledge of graph databases, streaming analytics, HDInsight, etc.
- Experience in EMR/MSK/Elasticsearch/Neptune, etc.
- Excellent logical and physical data modelling skills using tools such as Erwin,
Enterprise Architect, etc.
- Ability to develop object-oriented code using Python, in addition to PySpark, SQL, and other languages
- Experience developing ETL pipelines in and out of a data warehouse using a combination of Python
and Snowflake's SnowSQL
- Writing SQL queries against Snowflake.
- Experience developing Unix and Python scripts to extract, load, and transform data.
- Knowledge of implementing Redshift on AWS
- Experience building pipelines and orchestration of workflows in an enterprise
environment
- Experience with streaming technologies, both on-premises and in the cloud, such as
consuming from and producing to Kafka, Kinesis, etc.
- Experience implementing batch processing using AWS Glue/Lake Formation, Lambda
& Data Pipeline
- Experience optimizing the cost of the services being utilized
- Perform ongoing monitoring, automation and refinement of data engineering
solutions
- Must be highly collaborative and able to work in a team environment with both
technical and business people
- Excellent communication, problem-solving, and customer service skills with the ability
to translate technical detail into non-technical information
- Strong analytical skills and care about how your work gets done.
- Eager to learn, no matter how successful you might already be!
Responsibilities:
- Understand the Azure or AWS data platform tools from a data analytics perspective
- Work on Data Factory, Databricks, Synapse, Data Lake Gen2, Stream Analytics,
Azure Spark, Azure ML, AWS SageMaker, AWS Glue, EMR, Neptune, etc.
- Work on databases such as Oracle, SQL Server DB, MySQL, Cosmos DB, etc.
- Develop ETL pipelines in and out of the data warehouse using a combination of Python
and Snowflake's SnowSQL.
- Write SQL queries against Snowflake.
- Develop Unix and Python scripts to extract, load, and transform data.
- Collaborate with project stakeholders such as database administrators, technical
architects, business analysts, security experts, and information modelling experts to
determine project needs and plan development and implementation strategies
- Define, review, and explain Data Architecture requirements & design to all the
project stakeholders
- Lead the migration of data from legacy systems to the newly developed solution.
- Create strategies and design solutions for a wide variety of use cases, such as data
migration (end-to-end ETL processes), database optimization, and data architecture
solutions for analytics and Big Data projects.
- Troubleshoot highly complex technical problems in OLAP/OLTP/DW, Analytics, and
Big Data environments, and design and develop solutions for enterprise-level
applications.
- Implement data quality processes for use with MDM, BI solutions, data warehouses,
EAI solutions, etc.
- Work on streamlining data flows and data models consistently.
- Work on streaming technologies, both on-premises and in the cloud, such as
consuming from and producing to Kafka, Kinesis, etc.
- Have a keen focus on improving and tuning data quality, accessibility, performance
and security needs
- Identifying technical problems and developing software updates and fixes.
- Working with software developers and software engineers to ensure that
development follows established processes and works as intended
- Planning out projects and being involved in project management decisions.
- Build an internal wiki with technical documentation, manuals and IT policies.
Good to know:
- Infrastructure as Code.
- Knowledge of working with Data mesh.
- Knowledge of Kafka, Snowflake, etc.
- Certifications in Architecture, Analytics, Data Science, Big Data, or Cloud will be an added
advantage
- Knowledge of Google Cloud Platform.
Personal Attributes:
- A passion for continuous improvement in both technology and process
- Strong interpersonal, problem solving, and organizational skills
Benefits:
- 5-day work week & flexible working hours.
- 25 paid leaves in one year (along with 10 other festive holidays) + Leave encashment.
- Medical insurance covering spouse and children.
- Maternity & Paternity leave benefits.
- Wellness program and counselling.
- 6-monthly appraisal cycle.
- Yearly performance bonus based on projects.
- Strategic growth with direct client interaction at all levels.
- Onsite opportunity based on the project requirement.
- Training, certifications, events, activities & outings.