Data Engineer 2

Date: May 3, 2024

Location: San Antonio, TX, US, 78205

Company: CPS Energy

We are engineers, high line workers, power plant managers, accountants, electricians, project coordinators, risk analysts, customer service operators, community representatives, safety and security specialists, communicators, human resources partners, information technology technicians and much, much more. We are 3,500 people committed to enhancing the lives of the communities we serve. Together, we are powering the growth and success of our community's progress every day!

Position Summary

Provide development and automation of computing processes, on premises and in the cloud, using cloud-based architecture to detect, predict, and respond to opportunities in business operations. Work with a variety of disparate datasets that span many disciplines and business units, including weather, transmission and distribution grid infrastructure, power generation, gas delivery, commercial market operations, safety and security, and customer engagement. Strive to transform and implement true business integration by leveraging top-notch data integration best practices, merging and securing data in a way that reduces maintenance cost and increases the utilization of enterprise-wide data as an asset, and developing business intelligence.

Grade: 14
Qualifications may warrant placement in a different job level.
Deadline to apply: Open until filled

Tasks and Responsibilities
  • Design, develop, and unit test new or existing ETL/data integration solutions to meet business requirements.

  • Provide daily production support for the Enterprise Data Warehouse, including jobs in Alteryx, Hadoop, and Oracle PL/SQL, and be flexible to manage high-severity incident/problem resolution.

  • Develop data integration and ETL/ELT workflows in the cloud environment using cloud-based architecture (Azure); a minimal illustrative sketch appears after this list.

  • Participate in troubleshooting and resolving data integration issues, such as data quality problems.

  • Deliver increased productivity and effectiveness through rapid delivery of high-quality applications.

  • Provide work estimates and communicate status of assignments.

  • Assist in QA efforts on tasks by providing input for test cases and supporting test case execution.

  • Analyze transaction errors, troubleshoot software issues, develop bug fixes, and participate in performance tuning efforts.

  • Develop Alteryx workflows and complex Oracle PL/SQL programs for the Data Warehouse.

  • Responsible for selecting and using DevOps tools for continuous integration, builds, and monitoring of solutions.

  • May provide input to area budget.

  • Makes some independent decisions and recommendations which affect the section, department and/or division.

  • Performs other duties as assigned.
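
For illustration only, the following is a minimal sketch of the kind of batch ETL workflow described in the responsibilities above, written in PySpark; the paths, column names, and aggregation are assumptions made for the example and are not details from this posting.

    # Minimal illustrative PySpark ETL sketch; paths and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("meter-readings-etl").getOrCreate()

    # Extract: raw meter readings landed as CSV (placeholder path).
    raw = (spark.read
           .option("header", True)
           .option("inferSchema", True)
           .csv("/landing/meter_readings/"))

    # Transform: deduplicate, drop invalid readings, derive a reading date.
    clean = (raw
             .dropDuplicates(["meter_id", "reading_ts"])
             .filter(F.col("kwh") >= 0)
             .withColumn("reading_date", F.to_date("reading_ts")))

    # Aggregate to daily usage per meter.
    daily = (clean.groupBy("meter_id", "reading_date")
                  .agg(F.sum("kwh").alias("daily_kwh")))

    # Load: write partitioned Parquet for downstream warehouse loads.
    daily.write.mode("overwrite").partitionBy("reading_date").parquet("/curated/daily_usage/")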

Minimum Skills
Minimum Knowledge and Abilities
  • Experience in a data integration role.
  • Experience using Apache Spark, NiFi, and/or Kafka.
  • Experience using Python.
  • Experience integrating enterprise software using ETL modules.
  • Knowledge of data architecture, structures, and principles, with the ability to critique data and system designs.
  • Ability to design, create, and/or modify data processes that meet key timelines while conforming to predefined specifications, using the Informatica and/or MuleSoft platforms.
  • Understanding of big data technologies and platforms (Hadoop, Spark, MapReduce, Hive, HBase, MongoDB).
  • Ability to integrate data from web services in XML, JSON, flat file, and SOAP formats (see the sketch after this list).
  • Knowledge of core concepts of RESTful API Modeling Language (RAML 1.0) and designing with MuleSoft solutions.
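
As a purely illustrative sketch of the web-service integration ability above, the Python snippet below pulls JSON from a hypothetical REST endpoint and flattens it to a flat file; the URL and field names are assumptions, not details from this posting.

    # Minimal sketch: fetch JSON from a (hypothetical) web service and flatten it to CSV.
    import csv
    import requests

    resp = requests.get("https://example.com/api/outages", timeout=30)
    resp.raise_for_status()
    records = resp.json()  # assumes the endpoint returns a list of JSON objects

    fields = ["outage_id", "feeder", "start_time", "customers_affected"]  # hypothetical fields
    with open("outages.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for rec in records:
            writer.writerow({k: rec.get(k) for k in fields})
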
Preferred Qualifications
  • Relevant Certifications
  • Experience in API Management
  • Proficiency with the following databases/technologies: MuleSoft Anypoint Studio, Informatica PowerCenter, Oracle RDBMS, PL/SQL, MySQL
  • Knowledge of Test Driven Development (TDD)
  • Familiarity with cloud-based architecture
  • Experience with data analysis & model prototyping using Spark/Python/SQL and common data science tools & libraries (e.g., NumPy, Pandas, scikit-learn, TensorFlow); see the sketch after this list
  • Experience in a technology organization
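
As an illustration of the model-prototyping experience listed above, here is a minimal pandas/scikit-learn sketch; the file name, feature columns, and model choice are assumptions made only for this example.

    # Minimal model-prototyping sketch with pandas and scikit-learn (placeholder data).
    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    # Hypothetical daily usage data joined with weather features.
    df = pd.read_csv("daily_usage.csv")
    X = df[["avg_temp_f", "humidity", "is_weekend"]]
    y = df["daily_kwh"]

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    model = LinearRegression().fit(X_train, y_train)
    print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
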
Competencies
Demonstrating Initiative
Communicates Effectively
Using Computers and Technology
Driving for Results
Minimum Education
Bachelor's degree in Computer Science, Engineering, or related field from an accredited university.
Required Certifications
Working Environment
Indoor work, operating computer, manual dexterity, talking, hearing, repetitive motion. Use of personal computing equipment, telephone, multi-functioning printer and calculator.
Ability to travel to and from meetings, training sessions or other business-related events. Ability to perform after-hours and weekend work as required.
Physical Demands
Exerting up to 10 pounds of force occasionally, and/or a negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects, including the human body.
Sedentary work involves sitting most of the time. Jobs are sedentary if walking and standing are required only occasionally, and all other sedentary criteria are met.

CPS Energy does not discriminate against applicants or employees. CPS Energy is committed to providing equal opportunity in all of its employment practices, including selection, hiring, promotion, transfers and compensation, to all qualified applicants and employees without regard to race, religion, color, sex, sexual orientation, gender identity, national origin, citizenship status, veteran status, pregnancy, age, disability, genetic information or any other protected status. CPS Energy will comply with all laws and regulations.



Job Summary
Company: CPS Energy
Start Date: As soon as possible
Employment Term and Type: Regular, Full Time
Required Education: Bachelor's Degree
Required Experience: Open