Job Location: India
Description:
If you desire to be part of something special, to be part of a winning team, to be part of a fun team – winning is fun. We are looking for a Data Scientist based in Pune, India. At Eaton, making our work exciting, engaging, and meaningful; ensuring safety, health, and wellness; and being a model of inclusion & diversity are already embedded in who we are – it’s in our values, part of our vision, and our clearly defined aspirational goals. This exciting role offers the opportunity to:
- The Data Science Engineer will be involved in the design and development of ML/AI algorithms to solve power management problems. In addition to developing these algorithms, the Data Science Engineer will also be involved in the successful integration of algorithms into edge or cloud systems using CI/CD and software release processes.
- The candidate will demonstrate exceptional impact in delivering projects in terms of architecture, technical deliverables, and project delivery throughout the project lifecycle. The candidate is expected to be conversant with Agile methodologies and tools.
- Gain experience in ML/AI model development and the related development environments that enable ML/AI algorithms to run on various processors or systems
- Work with a team of experts in deep learning, machine learning, distributed systems, program management, and product teams, across all aspects of the design, development, and delivery of deep-learning-enabled end-to-end pipelines and solutions.
- Develop technical solutions and implement architectures for projects and products alongside data engineering and data science teams
- Participate in the architecture, design, and development of new intelligent power technology products and production-quality end-to-end systems
Qualifications
Requirement:
- Master’s degree in Data Science, or Ph.D. (ongoing) in Data Science
- 1+ years of progressive experience in delivering technology solutions in a production environment
- 1+ years of experience in the software industry as a developer, with a proven track record of shipping high quality products
- 1+ years of experience working with customers (internal and external) to develop requirements and deliver solutions as a solutions architect
- Good statistical background, including Bayesian networks, hypothesis testing, etc.
- Hands-on development of deep learning and machine learning models for engineering applications such as electrical/electronic systems, energy systems, data centers, and mechanical systems
- Hands-on experience with ML/DL models for tasks such as time series modeling, anomaly detection, and root cause analysis
- Knowledge of digital signal processing techniques as well as advanced signal processing techniques such as Kalman filters and particle filters
- Programming knowledge – Python, R, MATLAB, C/C++, Java, PySpark, SparkR
- Azure ML Pipelines, Databricks, MLflow
- Experience in deploying algorithms in on-prem edge systems or cloud
- Hands-on experience with optimization techniques such as dynamic programming, particle swarm optimization, etc.
- Operating system experience – Linux (preferred)
- Knowledge of software development life-cycle processes and tools
- Agile development methodologies and concepts, including hands-on experience with Jira, Bitbucket, and Confluence
- Knowledge of computer vision, natural language processing, and recommendation AI systems
- Experience with open-source projects such as OpenCV, GStreamer, OpenVINO, ONNX, TensorFlow, PyTorch, and Caffe
- TensorFlow, scikit-learn, Keras, Spark ML
- Knowledge of MLOps
- Knowledge of Continuous Integration/Continuous Delivery (CI/CD) platforms and tooling, e.g. Jenkins, Git, Travis CI
- Knowledge of streaming technologies such as Apache Kafka, AWS Kinesis, and Azure Event Hubs
- Knowledge of design and deployment of a Data Lake
- Knowledge of Cloudera Hadoop
- Knowledge of IoT technologies, including cloud processing, like Azure IoT Hub.
- Knowledge of data analysis tools, like Apache Presto, Hive, Azure Data Lake Analytics, AWS Athena, Zeppelin
- Experience in Design Thinking or human-centered methods to identify and creatively solve customer needs through a holistic understanding of the customer’s problem area
- Advanced degree and/or specialization in a related discipline (e.g. machine learning)
- Knowledgeable in leveraging multiple data transit protocols and technologies (MQTT, REST APIs, JDBC, etc.)
- Knowledge of Hadoop and MapReduce/Spark or related frameworks
- Knowledge of MongoDB, DocumentDB, Cosmos DB
- Knowledge of Scala
- Excellent verbal and written communication skills including the ability to effectively communicate technical concepts as a part of virtual, global teams
- Good interpersonal, negotiation and conflict resolution skills
- Ability to understand academic research and apply new data science techniques
- Experience being part of larger teams with established big data platform practices, as well as smaller teams where individual scope and impact are greater
- Experience and awareness from working with global teams, and strong communication skills to interact with them
- Innate curiosity
- Self-directed and hungry to learn – a person who, with time on their hands, will independently find interesting ways to push the envelope, learning new skills and growing themselves and the team.
- Team player – we work in small, fast-moving teams.
We make what matters work. Everywhere you look—from the technology and machinery that surrounds us, to the critical services and infrastructure that we depend on every day—you’ll find one thing in common. It all relies on power. That’s why Eaton is dedicated to improving people’s lives and the environment with power management technologies that are more reliable, efficient, safe and sustainable. Because this is what matters.
We are confident we can deliver on this promise because of the attributes that our employees embody. We’re ethical, passionate, accountable, efficient, transparent and we’re committed to learning. These values enable us to tackle some of the toughest challenges on the planet, never losing sight of what matters.
Job: Engineering
Region: Asia Pacific
Organization: INNOV Innovation Center
Job Level: Individual Contributor
Schedule: Full-time
Is remote work (i.e. working from home or another Eaton facility) allowed for this position?: No
Does this position offer relocation?: Relocation from within hiring country only
Travel: No