Job Location: Chennai
Designs, develops, tests, debugs, and implements data ingestion and processing pipelines to drive business impact, with full end-to-end ownership of the development cycle. Gains a strong understanding of the business process, transforms it into data business logic, and turns complex ideas into sustainable deliverables.
Responsibility Summary
- Data Wrangling: Develop data flows and models from multiple sources with varying degrees of complexity and data transformations.
- Implement data quality testing to identify data issues, validating the designed logic and outputs.
- Collaborate with subject matter experts to test and validate data deliverables.
- Ensure that all code is well structured, includes sufficient documentation, and is easy to maintain and reuse.
- Analysis: Communicate technical dependencies to the relevant teams and coordinate for seamless implementation.
- Collaborate with subject matter experts to gain an understanding of business functions and needs.
- Translate business needs into design specifications and code.
- Interpret use-case requirements and design the target data model/data mart.
- Operational: Produce appropriate documentation and notifications for changes to production systems or fixes to production problems.
- Proactively identify opportunities for efficiencies by documenting, monitoring, and improving production processes and controls.
- Provide production on-call support on a normal rotation; take ownership of incidents and bring them to closure by troubleshooting and resolving support issues quickly, then summarize and report the results.
- Troubleshoot and debug when data is found to be inaccurate.
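The data-quality and troubleshooting duties above can be illustrated with a minimal sketch; the record layout, field names, and validation rules here are hypothetical, invented purely for the example:

```python
# Minimal sketch of automated data-quality checks on an ingested batch.
# The fields ("order_id", "amount") and rules are hypothetical examples.

def run_quality_checks(rows):
    """Validate a batch of ingested records and return a list of issues."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness: required fields must be present and non-null.
        for field in ("order_id", "amount"):
            if row.get(field) is None:
                issues.append(f"row {i}: missing {field}")
        # Uniqueness: the primary key must not repeat within the batch.
        oid = row.get("order_id")
        if oid is not None:
            if oid in seen_ids:
                issues.append(f"row {i}: duplicate order_id {oid}")
            seen_ids.add(oid)
        # Validity: amounts must be non-negative.
        amount = row.get("amount")
        if amount is not None and amount < 0:
            issues.append(f"row {i}: negative amount {amount}")
    return issues

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": -5.0},   # duplicate id, negative amount
    {"order_id": 2, "amount": None},   # missing amount
]
print(run_quality_checks(batch))
```

In practice such checks would run inside the pipeline after each load, with failures routed to the monitoring and notification processes described above.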
Functional/Technical Qualifications:
- Design and development with knowledge of DW architecture, data modeling, data normalization and ETL processes.
- SQL knowledge and experience having worked with relational databases, managing medium-large data sets, query authoring, and complex stored procedures.
- Knowledge of data pipeline orchestration tools such as Informatica, WhereScape, Airflow, Pentaho, Stitch, etc.
- Proficiency interacting with database and file storage systems such as SQL Server, Oracle, and Redshift.
- Exposure to object-oriented/functional scripting languages such as Python, PowerShell, etc.
- Optional experience with visualization tools such as Qlik Sense, Power BI, or others.
- Optional experience in working with SSIS.
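As a rough illustration of the SQL and ETL skills listed above, a minimal extract-transform-load step might look like the following sketch; the schema, table names, and sample values are invented for the example, and an in-memory SQLite database stands in for a real warehouse:

```python
# Minimal ETL sketch: load raw records, then transform them into an
# aggregated data-mart table with SQL. Schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")

# Extract/load: land raw records in a staging table.
conn.execute("CREATE TABLE raw_sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_sales VALUES (?, ?)",
    [("south", 120.0), ("south", 80.0), ("north", 50.0)],
)

# Transform: aggregate the staged rows into a summary mart table.
conn.execute(
    """
    CREATE TABLE sales_mart AS
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS n_orders
    FROM raw_sales
    GROUP BY region
    """
)

for row in conn.execute("SELECT * FROM sales_mart ORDER BY region"):
    print(row)
```

In a production setting the same pattern would be expressed in a warehouse platform and scheduled by one of the orchestration tools named above, rather than run as a standalone script.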
Competency Expectations
- Communication: Good communication skills to collaborate and communicate effectively with internal and external contacts.
- Ability to communicate technical information to non-technical audiences.
- Collaborate well with associates throughout the organization.
- Acumen to cultivate and develop lasting customer relationships.
- Problem Solving: Problems faced are difficult and often complex.
- Strong analytical, critical, and systems thinking are required.
- Provides solutions to a variety of complex technical and business matters.
- Identify the requirement(s), analyze and develop solutions.
- Ability to execute projects of moderate-to-high complexity involving cross-functional teams.
- Autonomy: Professional individual contributor. Works independently with limited supervision.
- Ability to prioritize, manage and execute on multiple tasks/projects and adjust to changing priorities.
- Independently organize, plan, and prioritize individual effort to ensure overall success.
- Works general shifts and is willing to overlap with multiple time zones as needed.
Education:
- Four year college degree (or additional relevant experience in a related field).
- Minimum 4 years functional experience.