Job Location: Bangalore
Cloud Data Engineer
Job Description
• Design, develop, optimize, and maintain Cloud Data Platform pipelines that adhere to ETL principles and business goals
• Apply conceptual knowledge of data and analytics, such as dimensional modeling, ETL, reporting tools, data governance, data warehousing, and structured and unstructured data, to solve complex data problems
• Work with business/application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
• Develop and maintain conceptual, logical, and physical data models; implement RDBMS, operational data stores (ODS), data marts, and data lakes on target cloud platforms
• Work with database engineering and DBAs to create optimal physical data models (transactional, normalized, and dimensional models, etc.)
• Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
• Hands-on experience with modeling, design, configuration, installation, performance tuning, and setting up sandboxes for POCs
• Lead/partner with business analysts and solution architects to conduct detailed technical assessments of the enterprise data architecture and define a path to transform it into a modern, data-powered enterprise for strategic projects and initiatives
• Lead/partner with different teams and provide deep technical expertise to evaluate, implement, and deploy emerging tools and large-scale data solutions for data engineering to improve team productivity
• Conduct full technical discovery, identifying pain points, business and technical requirements, and "as is" and "to be" scenarios
• Compare solution alternatives across both technical and business parameters that support the defined cost and service requirements
• Create detailed target-state technical, security, data, and operational architecture and design blueprints incorporating modern data technologies and cloud data services, demonstrating the modernization value proposition
• Lead scoping sessions to generate estimates and approaches for execution
• Solve complex data problems to deliver insights that help the business achieve its goals
• Create data products for analytics and data science team members to improve their productivity
• Stay educated on new and emerging technologies, patterns, methodologies, and market offerings; learn about machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics
• Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes
• Advise, consult, mentor, and coach other data and analytics professionals on data standards and practices
• Experience developing solutions on cloud computing services and infrastructure in the data and analytics space (Azure or GCP)
• Experience with tools such as Azure Data Factory, Databricks, and Integration Runtime services
• Experience in Python or Scala
• Expert in T-SQL
• Big Data development experience using Kafka, Azure Event Hub, and IoT Hub
• Familiarity with the Linux operating system
• Bachelor's degree
• Minimum 1+ years of experience architecting large-scale data solutions, performing architectural assessments, crafting architectural options and analysis, and finalizing the preferred solution alternative working with IT and business stakeholders
• 1+ years of data engineering or architecture experience architecting, developing, and deploying scalable enterprise data analytics solutions
• 3+ years of IT experience, or an equivalent blend of education and experience
• Experience designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more cloud data and analytics services in combination with third-party tools
• Hands-on experience analyzing, re-architecting, and re-platforming on-premise data warehouses to cloud data platforms
• Experience designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture
• Experience designing and implementing data engineering, ingestion, and curation functions on cloud using native or custom programming
• Hands-on experience designing and implementing data ingestion solutions on cloud
• Hands-on experience architecting and designing data lakes on cloud serving analytics and BI application integrations
• Experience designing/developing large data warehouses and/or database management systems
• Expertise in data modeling principles and methods, including conceptual, logical, and physical data models
• Demonstrated experience modeling high-performance multi-dimensional data marts and reporting structures, including OLAP cubes
• 1+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols)
• 1+ years of experience with database development (SQL, PL/SQL) and scripting
• Minimum 3+ years of experience integrating cloud data services for building secure data solutions
• Minimum 3+ years of experience introducing and operationalizing self-service data preparation tools (e.g., Collibra, Alation, Babylon)
• Minimum 3+ years of experience architecting and operating large production SAP/NoSQL clusters on premise or using cloud services
• Minimum 1+ years of experience architecting and implementing metadata on cloud
• Architecting and implementing data governance and security for data platforms on cloud
• Designing operations architecture and conducting performance engineering for large-scale data lakes in a production environment
• Craft and lead client design workshops and provide tradeoffs and recommendations toward building solutions
• Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3+ years of architecture and build experience with large-scale solutions
Global VISA and Relocation Specifications:
K-C requires that an employee have authorization to work in the country in which the role is based. In the event an applicant does not have current work authorization, K-C will determine, in its sole discretion, whether to sponsor an individual for work authorization. However, based on immigration requirements, not all roles are suitable for sponsorship.
This role is available only for local candidates already authorized to work in the role's country. K-C will not provide relocation support for this role.
Primary Location
IT Centre Bengaluru GDTC
Additional Locations
Worker Type
Employee
Worker Sub-Type
Regular
Time Type
Full time