CPS Energy | Data Engineer 3 – Data & Analytics Engineering | San Antonio, TX | United States | BigDataKB.com | 2023-01-17


Job Location: San Antonio, TX

Job Detail:

We are engineers, high-line workers, power plant managers, accountants, electricians, project coordinators, risk analysts, customer service operators, community representatives, safety and security specialists, communicators, human resources partners, information technology technicians and much, much more. We are 3,300 people committed to enhancing the lives of the communities we serve. Together, we are powering the growth and success of our community every day!


Position Summary

Develops and automates computing processes to detect, predict, and respond to opportunities in business operations. Works with a variety of disparate datasets spanning many disciplines and business units, including weather, transmission and distribution grid infrastructure, power generation, gas delivery, commercial market operations, safety and security, and customer engagement. Strives to implement true business integration by applying data integration best practices, merging and securing data in a way that reduces maintenance cost and increases the utilization of enterprise-wide data as an asset, and developing business intelligence.


GRADE: 16*

  • Qualifications may warrant placement in a different job level (1-3)


DEADLINE TO APPLY: Open Until Filled


Tasks and Responsibilities

  • Design, develop, and unit test new or existing ETL/data integration solutions to meet business requirements.
  • Provide daily production support for the Enterprise Data Warehouse, including ETL/ELT jobs.
  • Design and develop data integration/engineering workflows on big data technologies and platforms (Hadoop, Spark, MapReduce, Hive, HBase, MongoDB, Druid).
  • Develop data streams using Apache Spark, NiFi, and/or Kafka; apply strong Python development skills for data transfers and extractions (ELT or ETL).
  • Develop workflows in the cloud environment using cloud-based architecture (Azure or AWS).
  • Develop dataflows and processes for the Data Warehouse using SQL (Oracle, Postgres, HiveQL, Spark SQL & DataFrames).
  • Perform data analysis & model prototyping using Spark/Python/SQL and common data science tools & libraries (e.g., NumPy, Pandas, scikit-learn, TensorFlow).
  • Develop data integration workflows using web services in XML, JSON, flat file, and SOAP formats.
  • Participate in troubleshooting and resolving data integration issues, such as data quality problems.
  • Deliver increased productivity and effectiveness through rapid delivery of high-quality applications.
  • Provide work estimates and communicate status of assignments.
  • Assist in QA efforts on tasks by providing input for test cases and supporting test case execution.
  • Analyze transaction errors, troubleshoot software issues, develop bug fixes, and participate in performance tuning efforts.
  • Make some independent decisions and recommendations that affect the section, department, and/or division.
  • Participate in and provide input to the area budget; work within financial objectives/budget set by management.
  • Develop alternative solutions for decision-making that support organizational goals/objectives and budget constraints.
  • Work with minimum supervision, conferring with a superior on unusual matters. Incumbents have considerable freedom to decide on work priorities and procedures to be followed. May include limited supervisory responsibilities.
  • Provide reporting and analytics functionality to monitor API usage and load (overall hits, completed transactions, number of data objects returned, amount of compute time and other internal resources consumed, volume of data transferred).
  • Use results from API reporting/analytics to guide the organization's API developer offering within its overall continuous improvement process and to define software service-level agreements (SLAs) for APIs.
  • Performs other duties as assigned.
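To give candidates a feel for the ETL and data-quality work the bullets above describe, here is a minimal extract-transform-load sketch. It is purely illustrative and not part of the posting: the feed, field names, and quality rule are invented, and real pipelines in this role would run on Spark, NiFi, Kafka, or Informatica rather than the standard library.

```python
import csv
import json
from io import StringIO

# Hypothetical raw feed: meter readings as CSV. In practice the source
# would be a warehouse table, Kafka topic, or web-service extract.
RAW_CSV = """meter_id,reading_kwh,ts
M-001,12.5,2023-01-15T00:00:00
M-002,,2023-01-15T00:00:00
M-003,9.75,2023-01-15T00:00:00
"""

def extract(text):
    """Parse the raw CSV feed into dict records (the 'E' step)."""
    return list(csv.DictReader(StringIO(text)))

def transform(rows):
    """Apply a simple data-quality rule and cast types (the 'T' step)."""
    out = []
    for row in rows:
        if not row["reading_kwh"]:
            continue  # skip incomplete readings
        out.append({"meter_id": row["meter_id"],
                    "reading_kwh": float(row["reading_kwh"]),
                    "ts": row["ts"]})
    return out

def load(rows):
    """Serialize to JSON lines for downstream loading (the 'L' step)."""
    return "\n".join(json.dumps(r) for r in rows)

records = transform(extract(RAW_CSV))
print(load(records))
```

The same extract/transform/load shape scales up directly: in Spark, `extract` becomes a DataFrame read, `transform` a chain of column operations, and `load` a write to the warehouse.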


Minimum Skills


Minimum Knowledge and Abilities

Proven experience in a data integration role with expert-level SQL.
Experience with cloud-based architecture (examples: Bluemix, Google Cloud, or AWS development).
Experience using Apache Spark, NiFi, and/or Kafka.
Experience using Spark/Python/SQL and common data science tools & libraries (e.g., NumPy, Pandas, scikit-learn, TensorFlow).
Proven experience integrating enterprise software using ETL modules/Data Engineering tools
Knowledge of data architecture, structures and principles with the ability to critique data and system designs.
Ability to design, create and/or modify data processes that meet key timelines while conforming to predefined specifications, utilizing the Informatica and/or MuleSoft platform.
Experience in big data technologies and platforms (Hadoop, Spark, MapReduce, Hive, HBase, MongoDB)
Ability to integrate data from Web services in XML, JSON, flat file format, SOAP.
Knowledge of core concepts of RESTful API Modeling Language (RAML 1.0) and designing with MuleSoft solutions.
Experience with data science / data analysis and their associated tools.
Experience in Test Driven Development (TDD).
Knowledge of DevOps practices and tools (examples: Jenkins, Travis CI, UrbanCode Deploy, Nagios).
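The web-service integration skills listed above typically mean denormalizing a nested API payload into flat warehouse-style rows. The fragment below is a hypothetical illustration only: the JSON shape and field names are invented, and in this role the work would target MuleSoft or Informatica rather than hand-rolled code.

```python
import json

# Hypothetical web-service response: one account with monthly usage.
payload = json.loads("""
{"account": {"id": 42, "city": "San Antonio"},
 "usage": [{"month": "2022-11", "kwh": 830},
           {"month": "2022-12", "kwh": 910}]}
""")

def flatten(doc):
    """Denormalize the nested document into flat rows, one per usage month."""
    acct = doc["account"]
    return [{"account_id": acct["id"],
             "city": acct["city"],
             "month": u["month"],
             "kwh": u["kwh"]}
            for u in doc["usage"]]

rows = flatten(payload)
```

Each output row repeats the parent account keys, so the result can be bulk-loaded into a single relational table.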


Preferred Qualifications

  • Relevant Certifications
  • Ability to write code to build ETL / ELT data pipelines on premise/Cloud
  • Experience in API Management
  • Proficiency with the following databases/technologies: MuleSoft Anypoint Studio, Informatica PowerCenter, Oracle RDBMS, PL/SQL, MySQL
  • Professional experience in a technology organization


Competencies

Demonstrating Initiative
Communicates Effectively
Using Computers and Technology
Driving for Results


Minimum Education

Bachelor’s Degree in Computer Science, Engineering, or related field from an accredited university.


Required Certifications


Working Environment

Indoor work, operating computer, manual dexterity, talking, hearing, repetitive motion. Use of personal computing equipment, telephone, multi-functioning printer and calculator. Ability to travel between business related events. Work hours may be extended.


Physical Demands

Office Environment

CPS Energy does not discriminate against applicants or employees. CPS Energy is committed to providing equal opportunity in all of its employment practices, including selection, hiring, promotion, transfers and compensation, to all qualified applicants and employees without regard to race, religion, color, sex, sexual orientation, gender identity, national origin, citizenship status, veteran status, pregnancy, age, disability, genetic information or any other protected status. CPS Energy will comply with all laws and regulations.




Apply Here
