Data Engineer, Advisor at Edison International
Rosemead, California, USA
Full Time


Start Date

Immediate

Expiry Date

09 Sep 2025

Salary

$236,700

Posted On

10 Jun 2025

Experience

0 years or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Azure, Cloud Computing, Computer Science, Work Management, AWS, Python, Information Systems, Data Transformation, Outage Management, Aggregation, Asset Management

Industry

Information Technology/IT

Description

MINIMUM QUALIFICATIONS

  • Bachelor’s Degree in Computer Science, Information Systems, Engineering, Statistics/Mathematics, or an equivalent STEM major
  • Seven or more years of experience processing large data sets, with hands-on data transformation, aggregation, and filtering.

PREFERRED QUALIFICATIONS

  • Three or more years of experience with electric utility functional domain data (Asset Management, Work Management, Outage Management, Energy Procurement, Grid Ops, etc.).
  • Cloud Data Engineer certifications: GCP, Azure, and/or AWS.
  • One or more years of experience with cloud computing.
  • Three or more years of experience with Python, PySpark, and/or NodeJS/VueJS.
  • Snowflake certification.
  • Cloud Analytics certification.
RESPONSIBILITIES

  • Manages and scales data pipelines from internal and external data sources to support new product launches and drive data quality across data products
  • Facilitates data engineering activities covering data acquisition, extraction, normalization, transformation, management, and manipulation of large and complex data sets
  • Keeps abreast of new and current data engineering, big data, and data science techniques; researches methods, techniques, and emerging practices; develops and promotes data engineering best practices, standards, and guidelines
  • Works closely with subject matter experts to design and develop front-end applications, along with the data models and data pipelines that support them
  • Provides governance and oversight of data assets, data environments and relevant data procedures, with the proactive planning and enforcement of data asset naming conventions
  • Builds and maintains data pipelines using big data processing technologies to process and analyze large datasets; uses ETL processes on the internal cluster
  • Develops and maintains data warehouse schema and data models, ensuring data consistency and accuracy
  • Ensures ongoing alignment of technical and business strategies based on changing business and technology drivers and risks
  • Creates advanced visualizations and tools to provide insight, drive action, and support work throughout the business
  • Identifies and designs Data-as-a-Service candidates for improved data availability and consumption
  • A material job duty of all positions within the Company is ensuring the protection of all its physical, financial and cybersecurity assets, and properly accessing and managing private customer data, proprietary information, confidential medical records, and other types of highly sensitive information and data with the highest standards of conduct and integrity.