Job Location: Milwaukee, WI
Job Description:
- Candidates should have experience in building and deploying machine learning models using the Databricks platform.
- Experience working with PySpark (the Python API for Spark), the Databricks Machine Learning Runtime, and other popular machine learning libraries such as TensorFlow and scikit-learn.
- The Developer would work closely with data engineers and data scientists to understand the requirements of the business, and then design and implement models that meet those needs.
- This role is also responsible for monitoring the models' performance and making necessary adjustments.
- Additionally, the role requires integrating the developed models with the rest of the data processing pipeline and applications.
- Understanding the business problem that the model is being used to solve, and identifying appropriate data sets and features for model training.
- Cleaning and preprocessing data to make it suitable for model training and evaluation.
- Selecting appropriate algorithms and techniques for training and evaluating regression models.
- Tuning model parameters to optimize performance.
- Evaluating the performance of the model using metrics such as mean squared error or R-squared.
- Monitoring and maintaining the model in a production environment, including making updates and adjustments as needed.
- AWS Data Engineer with experience on Databricks.
- Good knowledge of statistical and mathematical functions.
- Collaborating with data engineers, data scientists, and other stakeholders to ensure that the model meets business requirements and delivers value.
- Application development, coding, testing, design reviews, database design, and implementation for the Field Rewards Data Platform, which serves all Field Rewards applications (such as Sales Reporting, Rewards Recognition, and Distribution) and exposes data to downstream applications through APIs.
- Building a data hub using Databricks on AWS.
- Developing advanced analytics on AWS for Field Rewards, ensuring fast querying for data analytics over massive volumes of data and feeding data to business intelligence tools, dashboards, and other applications.
- The Field Rewards data team wants to extend its capabilities in the data analytics space by integrating with the enterprise Unified Data Platform (UDP). Once integrated with UDP, Field Rewards wants to deliver capabilities in the advanced data analytics space.
- Develop database architectural strategies at the modeling, design, and implementation stages to address business or industry requirements: data maintenance, database security, and database management.
- Integrate with the Unified Data Platform by setting up a pipeline (real-time/batch) to bring data from the Field Rewards data hub to the Unified Data Platform.
- PySpark, ETL, and DataFrame knowledge is a must.
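The evaluation metrics named in the description above (mean squared error and R-squared) can be sketched in plain Python; the data values below are made up for illustration, not taken from any real model:

```python
def mse(y_true, y_pred):
    """Mean squared error: the average of the squared residuals."""
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

# Hypothetical targets and predictions from a regression model
y_true = [3.0, 5.0, 7.0, 9.0]
y_pred = [2.8, 5.1, 7.3, 8.9]

print(mse(y_true, y_pred))        # 0.0375 (lower is better)
print(r_squared(y_true, y_pred))  # 0.9925 (closer to 1.0 is better)
```

In practice these metrics would come from a library evaluator (e.g. scikit-learn's `mean_squared_error` and `r2_score`, or Spark MLlib's `RegressionEvaluator`) rather than hand-rolled functions; the sketch just makes the formulas concrete.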
Scope of work:
- Integrate with UDP by setting up a pipeline (real-time/batch) to bring data from the Field Rewards data hub to UDP.
- Daily pre-distribution data to UDP.
- Pre-distribution historical data to UDP.
- Sales Reporting data (current and historical) to UDP.
- Rewards Recognition data (current and historical) to UDP.
- FRM 2022 (Tax, Minimum earnings Bonus) data to UDP.
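Each of the data movements above is a batch extract-transform-load step. A minimal sketch in plain Python, using a hypothetical record schema (`agent_id`, `amount`) and in-memory streams standing in for the Field Rewards data hub and UDP storage, since neither schema nor endpoints are specified here:

```python
import io
import json

def extract(source):
    """Read newline-delimited JSON records exported from the data hub."""
    return [json.loads(line) for line in source if line.strip()]

def transform(records):
    """Keep and normalize only the fields the target platform expects
    (hypothetical schema), dropping records with missing amounts."""
    return [
        {"agent_id": r["agent_id"], "amount": round(r["amount"], 2)}
        for r in records
        if r.get("amount") is not None
    ]

def load(records, sink):
    """Write transformed records as newline-delimited JSON to the target."""
    for r in records:
        sink.write(json.dumps(r) + "\n")

# Example run: one valid record and one with a null amount
source = io.StringIO(
    '{"agent_id": "A1", "amount": 10.456}\n'
    '{"agent_id": "A2", "amount": null}\n'
)
sink = io.StringIO()
load(transform(extract(source)), sink)
```

In the actual role these steps would run on Databricks as PySpark DataFrame reads, transformations, and writes scheduled per feed (daily pre-distribution, historical backfills, and so on); the sketch only illustrates the extract-transform-load shape common to all of them.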