Job Location: San Antonio, TX
Job Detail:
We are engineers, high-line workers, power plant managers, accountants, electricians, project coordinators, risk analysts, customer service operators, community representatives, safety and security specialists, communicators, human resources partners, information technology technicians and much, much more. We are 3,300 people committed to enhancing the lives of the communities we serve. Together, we are powering the growth and success of our community every day!
Position Summary
Develop and automate computing processes that detect, predict, and respond to opportunities in business operations. Work with a variety of disparate datasets spanning many disciplines and business units, including weather, transmission and distribution grid infrastructure, power generation, gas delivery, commercial market operations, safety and security, and customer engagement. Strive to implement true business integration by applying data integration best practices, merging and securing data in ways that reduce maintenance costs and increase the utilization of enterprise-wide data as an asset, and develop business intelligence.
GRADE: 16*
- *Qualifications may warrant placement in a different job level (1-3)
DEADLINE TO APPLY: Open Until Filled
Tasks and Responsibilities
- Design, develop, and unit test new or existing ETL/data integration solutions to meet business requirements.
- Provide daily production support for the Enterprise Data Warehouse, including ETL/ELT jobs.
- Design and develop data integration/engineering workflows on big data technologies and platforms (Hadoop, Spark, MapReduce, Hive, HBase, MongoDB, Druid).
- Develop data streams using Apache Spark, NiFi, and/or Kafka, with strong Python development for data transfers and extractions (ELT or ETL); a minimal sketch of such a pipeline follows this list.
- Develop workflows in the cloud using cloud-based architectures (Azure or AWS).
- Develop dataflows and processes for the Data Warehouse using SQL (Oracle, Postgres, HiveQL, Spark SQL, and DataFrames).
- Perform data analysis and model prototyping using Spark/Python/SQL and common data science tools and libraries (e.g., NumPy, pandas, scikit-learn, TensorFlow).
- Develop data integration workflows using web services with XML, JSON, flat-file, and SOAP payloads.
- Participate in troubleshooting and resolving data integration issues, such as data quality problems.
- Increase productivity and effectiveness through rapid delivery of high-quality applications.
- Provide work estimates and communicate status of assignments.
- Assist in QA efforts on tasks by providing input for test cases and supporting test case execution.
- Analyze transaction errors, troubleshoot software issues, develop bug fixes, and participate in performance tuning efforts.
- Make some independent decisions and recommendations that affect the section, department, and/or division.
- Participate in and provide input to the area budget. Work within financial objectives/budget set by management.
- Develop alternative solutions for decision-making that support organizational goals/objectives and budget constraints.
- Work with minimal supervision, conferring with a supervisor on unusual matters; incumbents have considerable freedom to decide on work priorities and procedures. May include limited supervisory responsibilities.
- Provide reporting and analytics functionality to monitor API usage and load (overall hits, completed transactions, number of data objects returned, compute time and other internal resources consumed, volume of data transferred); a sketch of this kind of metric aggregation follows this list.
- Use results from API reporting/analytics to guide the API developer offering within the organization's overall continuous improvement process and to define software service-level agreements (SLAs) for APIs.
- Perform other duties as assigned.
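To make the ETL/ELT and Spark/Python bullets above concrete, here is a minimal PySpark sketch of the kind of batch pipeline this role builds. It is illustrative only: the landing-zone path, column names, and warehouse table are hypothetical placeholders, not actual CPS Energy systems.

# Minimal PySpark ETL sketch: extract raw meter readings, clean and
# aggregate them, and load the result into a warehouse table.
# All paths, columns, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("meter-readings-etl").getOrCreate()

# Extract: read raw CSV drops from a landing zone.
raw = spark.read.csv("/landing/meter_readings/", header=True, inferSchema=True)

# Transform: drop malformed rows and roll readings up to daily totals per meter.
daily = (
    raw.filter(F.col("kwh").isNotNull() & (F.col("kwh") >= 0))
       .withColumn("reading_date", F.to_date("reading_ts"))
       .groupBy("meter_id", "reading_date")
       .agg(F.sum("kwh").alias("daily_kwh"))
)

# Load: append into a date-partitioned warehouse table.
daily.write.mode("append").partitionBy("reading_date").saveAsTable("dw.daily_meter_usage")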
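Similarly, the API monitoring bullet can be sketched as a small aggregation job over gateway access logs. The log file name and its columns (endpoint, status, bytes_out, duration_ms) are assumed for illustration; real gateway log schemas vary.

# Sketch of API usage/load reporting from access logs using pandas.
# The file and column names below are assumed for illustration.
import pandas as pd

logs = pd.read_json("api_access_log.jsonl", lines=True)

report = logs.groupby("endpoint").agg(
    overall_hits=("endpoint", "size"),
    completed_transactions=("status", lambda s: int((s < 400).sum())),
    data_transferred_mb=("bytes_out", lambda b: b.sum() / 1e6),
    compute_time_s=("duration_ms", lambda d: d.sum() / 1000),
)
print(report)

Per-endpoint figures like these feed directly into the SLA definitions mentioned above (for example, error-rate and latency targets per API).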
Preferred Qualifications
- Relevant Certifications
- Ability to write code to build ETL/ELT data pipelines on-premises or in the cloud
- Experience in API Management
- Proficiency with the following databases/technologies: MuleSoft Anypoint Studio, Informatica PowerCenter, Oracle RDBMS, PL/SQL, MySQL
- Professional experience in a technology organization
CPS Energy does not discriminate against applicants or employees. CPS Energy is committed to providing equal opportunity in all of its employment practices, including selection, hiring, promotion, transfers and compensation, to all qualified applicants and employees without regard to race, religion, color, sex, sexual orientation, gender identity, national origin, citizenship status, veteran status, pregnancy, age, disability, genetic information or any other protected status. CPS Energy will comply with all laws and regulations.