Job Location: India
Job Name: AWS Developer
Years of Experience: 3 – 6 years
Notice Period: 30 days or less
Job Description: We are looking for a skilled and experienced Data Engineer to join our team! As part of the team, you will be involved in implementing both ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!
Primary Skills: AWS, Python, SQL / PL-SQL
Secondary Skills: Airflow, DBT, Fivetran, Kafka, Looker, Tableau
Role Description: The data engineering role involves creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows/pipelines; constructing data stores (NoSQL, SQL); working with big-data tools (Hadoop, Kafka); and using integration tools to connect sources and other databases.
Role Responsibility:
• Translate functional specifications and change requests into technical specifications
• Translate business requirement documents, functional specifications, and technical specifications into code
• Develop efficient code with unit testing and code documentation
• Ensure accuracy and integrity of data and applications through analysis, coding, documentation, testing, and problem solving
• Set up the development environment and configure the development tools
• Communicate project status to all project stakeholders
• Manage, monitor, and ensure the security and privacy of data to satisfy business needs
• Contribute to the automation of modules, wherever required
• Be proficient in written, verbal, and presentation communication (English)
• Coordinate with the UAT team
Role Requirement:
• Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
• Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
• Knowledgeable in Shell / PowerShell scripting
• Knowledgeable in relational databases, non-relational databases, data streams, and file stores
• Knowledgeable in performance tuning and optimization
• Experience in data profiling and data validation
• Experience in requirements gathering, documentation processes, and unit testing
• Understanding and implementation of QA and various testing processes in the project
• Knowledge of any BI tool is an added advantage
• Sound aptitude, outstanding logical reasoning, and analytical skills
• Willingness to learn and take initiative
• Ability to adapt to a fast-paced Agile environment
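To illustrate the "slowly changing dimensions" concept listed above, here is a minimal, hypothetical sketch of a Type 2 SCD update in plain Python (the function name, field names such as `valid_from`/`is_current`, and the data shape are illustrative assumptions, not part of this role's actual codebase; in practice this would typically be done in SQL or DBT):

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today=None):
    """Type 2 SCD: when a tracked attribute changes, expire the
    current row (set valid_to / is_current) and insert a new current
    row, preserving full history. All names here are illustrative."""
    today = today or date.today().isoformat()
    # index the currently-active version of each business key
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old is None:
            # brand-new key: insert as the first current version
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif any(old[c] != rec[c] for c in tracked):
            # tracked attribute changed: close old row, open a new one
            old["valid_to"] = today
            old["is_current"] = False
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension

dim = []
apply_scd2(dim, [{"customer_id": 1, "city": "Pune"}],
           "customer_id", ["city"], today="2024-01-01")
apply_scd2(dim, [{"customer_id": 1, "city": "Mumbai"}],
           "customer_id", ["city"], today="2024-02-01")
# dim now holds two rows: the expired Pune row and the current Mumbai row
```

The same expire-and-insert pattern is what a warehouse-side `MERGE` statement or a DBT snapshot performs.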
Additional Requirement:
• Experience in ETL tools (DBT, Fivetran, Airflow) to develop jobs for extracting, cleaning, transforming, and loading data into the DWH
• Ability to translate business requirements into functional/technical specifications, integrate multiple data sources and databases, and create database schemas
• Understanding of the threading limitations of Python, event-driven programming concepts, multi-process architecture, and accessibility and security compliance
• Knowledge of user authentication and authorization between multiple systems, servers, and environments
• Understanding of fundamental design principles for scalable applications
• Knowledge of code versioning tools such as Git
• Experience with Amazon Web Services using the Boto SDK, Jenkins, and continuous integration/deployment services
• Strong knowledge of Amazon's AWS offerings: RDS, Redshift, S3, EC2, ECS, Data Pipeline, Glue, Spectrum, Lambda
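The "event-driven programming" point above can be sketched with Python's standard asyncio library: because threads share one GIL, I/O-bound extract steps are often interleaved on an event loop instead. The source names and delay below are illustrative stand-ins, not part of any real pipeline:

```python
import asyncio

async def fetch(source, delay):
    # stand-in for an I/O-bound extract step (e.g. an API call);
    # awaiting yields control back to the event loop
    await asyncio.sleep(delay)
    return f"{source}:done"

async def extract_all(sources):
    # gather() runs the coroutines concurrently on one event loop,
    # interleaving their waits instead of running them serially
    return await asyncio.gather(*(fetch(s, 0.01) for s in sources))

results = asyncio.run(extract_all(["orders", "customers"]))
```

For CPU-bound transformations, by contrast, the GIL limits threads, and a multi-process architecture (e.g. `multiprocessing.Pool`) is the usual workaround.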