Job Location: Chennai
Primary Skills:
AWS, Python, SQL / PL/SQL
Secondary Skills:
Airflow, Python, dbt, Fivetran, Kafka, Looker, Tableau
Role Description:
This data engineering role involves creating and managing the technological infrastructure of a data platform: architecting and building data flows and pipelines, constructing data stores (SQL and NoSQL), working with big-data tools (Hadoop, Kafka), and using integration tools to connect data sources and other databases.
Role Responsibility:
- Translate functional specifications and change requests into technical specifications
- Translate business requirement document, functional specification, and technical specification to related coding
- Develop efficient code with unit testing and code documentation
- Ensure the accuracy and integrity of data and applications through analysis, coding, documentation, testing, and problem solving
- Set up the development environment and configure the development tools
- Communicate with all the project stakeholders on the project status
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules, wherever required
- Be proficient in written, verbal, and presentation communication (English)
- Coordinate with the UAT team
Role Requirement:
- Proficient in basic and advanced SQL programming concepts (Procedures, Analytical functions etc.)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell / PowerShell scripting
- Knowledgeable in relational databases, nonrelational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in Data Profiling and Data validation
- Experience in requirements gathering and documentation processes and performing unit testing
- Understanding and implementing QA and the various testing processes in the project
- Knowledge of any BI tool will be an added advantage
- Sound aptitude, outstanding logical reasoning, and analytical skills
- Willingness to learn and take initiatives
- Ability to adapt to a fast-paced Agile environment
Additional Requirements:
- Experience with ETL tools (dbt, Fivetran, Airflow) to develop jobs for extracting, cleaning, transforming, and loading data into the DWH
- Ability to translate business requirements into functional/technical specifications, integrate multiple data sources and databases, and create database schemas
- Understanding of Python's threading limitations, event-driven programming concepts, multi-process architecture, and accessibility and security compliance
- Knowledge of user authentication and authorization across multiple systems, servers, and environments
- Understanding of fundamental design principles for scalable applications
- Knowledge of code versioning tools such as Git
- Experience with Amazon Web Services using the Boto SDK, and with Jenkins for continuous integration and deployment (CI/CD)
- Strong knowledge of Amazon's AWS offerings: RDS, Redshift, S3, EC2, ECS, Data Pipeline, Glue, Spectrum, Lambda

