Job Location: Detroit, MI
Job Summary
Conducts data integration and analytics projects that automate data collection, transformation, storage, delivery, and reporting processes. Ensures optimization of data retrieval and processing, including performance tuning, delivery design for downstream analytics, machine learning modeling, feature engineering, and reporting. Works across multiple areas/teams to develop data integration methods that advance enterprise data and reporting capabilities. Span of control: 0; individual contributor.
Key Accountabilities
- Facilitates data engineering projects and collaborates with stakeholders to formulate end-to-end solutions, including data structure design to feed downstream analytics, machine learning modeling, feature engineering, prototype development, and reporting
- Works with business units, data architects, cloud engineers, and data scientists to identify relevant data, analyze data quality, design data requirements, and develop prototypes for proof-of-concepts
- Develops data sets and automated pipelines that support data requirements for process improvement and operational efficiency metrics
- Designs and implements data processing pipelines on on-premises or cloud platforms for optimal extraction, transformation, and loading of data from multiple data sources
- Builds reporting and visualizations that use the data pipelines to provide actionable insights into compliance rates, operational efficiency, and other key business performance metrics
- Designs and implements effective automation and testing strategies for data pipelines and processing methods
- Deploys and automates machine learning models in a data environment (e.g., SQL Server, cloud platforms, on-premises servers and machines), including workflow orchestration, scheduling, advanced data processing implementation, and data delivery tools
Minimum Education & Experience Requirements
This is a dual-track base requirement job; education and experience requirements can be satisfied through one of the following two options:
- Bachelor’s degree with emphasis on coursework of a quantitative nature (e.g., Computer Science, Mathematics, Physics, Data Science, Econometrics, etc.) and 3 years of experience working in a data engineering, data analytical, or computer programming function; OR
- Master’s degree with emphasis on coursework of a quantitative nature (e.g., Computer Science, Mathematics, Physics, Data Science, Econometrics, etc.) and 1 year of experience working in a data engineering, data analytical, or computer programming function.
Other Qualifications
Preferred:
- Experience with cloud platforms and cloud computing concepts (e.g., Azure)
- Business domain knowledge
- SQL Database design and query optimization experience
- Advanced business acumen
- Experience with utility/energy industry
Other Requirements:
- Intermediate-level programming skills in structured query language (e.g., SQL)
- Intermediate-level programming skills in a modern programming language (e.g., C#, Python, R, Java, etc.)
- Experience in agile development and working with continuous integration/continuous delivery (CI/CD) pipelines
- Intermediate-level ability in articulating business questions and pulling data from relational databases
- Intermediate-level proficiency in business intelligence tools and data blending tools (e.g., Microsoft Power Platform, Power BI, etc.)
- Proficiency with Big Data platforms, including data extraction and connection to the platform for analytics (e.g., Azure Data Factory, Azure Databricks, Hive, Spark, etc.)
- Ability to adapt to software platforms for retrieving, analyzing, sharing, and visualizing data
- Ability to identify optimal analytical tools based on problem requirements
- Analytical, problem solving, planning, and decision-making skills, with ability to identify key issues from a broad range of alternatives and recommend optimal solutions
- Advances own and others' knowledge and skill sets in business processes, new analytical frameworks, and data-driven technologies and applications
- Ability to communicate technical information and complex data analytics to a non-technical audience in a clear and concise manner (in both verbal and written form)
- Ability to work overtime during peak periods; flexibility in working hours required
- Proficiency with cloud computing concepts
- Ability to collaborate effectively with an agile team by managing expectations about delivery timelines and explaining complex technical concepts effectively
Additional Information
Incumbents may engage in all or some combination of the activities and accountabilities and utilize a variety of the competencies cited in this description depending upon the organization and role to which they are assigned. This description is intended to describe the general nature and level of work performed by incumbents in this job. It is not intended as an all-inclusive list of accountabilities or responsibilities, nor is it intended to limit the rights of supervisors or management representatives to assign, direct and control the work of employees under their supervision.