Data Engineer Senior / Lead at Progressive Casualty Insurance Company
United States, USA
Full Time


Start Date

Immediate

Expiry Date

12 Oct, 25

Salary

160,000 USD

Posted On

13 Jul, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Spark, SQL, Relational Databases, EC2, AWS, dbt, SQL Server Integration Services, Python, Snowflake

Industry

Information Technology/IT

Description

Job Number: 254824
Category: Technology
Location: United States
Remote Type: Remote
Job Level: Experienced
Progressive is dedicated to helping employees move forward and live fully in their careers. Your journey has already begun. Apply today and take the first step to Destination: Progress.
As a data engineer senior or lead, you’ll design, build, and maintain robust data pipelines and architectures that support advanced analytics and business intelligence initiatives. You will collaborate with cross-functional teams to gather requirements, optimize data workflows, and ensure data quality and integrity across multiple platforms. Leveraging your expertise in ETL processes, cloud technologies, and big data tools, you will drive the implementation of scalable solutions that enable efficient data access and reporting. Your role will also involve mentoring junior engineers, troubleshooting complex data issues, and contributing to the continuous improvement of data engineering best practices.
This is a remote position for US-based work only.

MUST-HAVE QUALIFICATIONS

  • Bachelor’s degree or higher in an Information Technology discipline or related field of study and a minimum of two years of work experience designing, programming, and supporting software programs or applications.
  • In lieu of a degree, a minimum of four years of related work experience designing, programming, and supporting software programs or applications may be accepted.

PREFERRED SKILLS

  • Proven experience building data pipelines for collecting, transforming, and integrating data from diverse sources (a minimal illustrative sketch follows this list).
  • Proficiency with cloud platforms and services such as AWS (S3, EC2, Lambda, EMR), Snowflake, and orchestration or transformation tools like HVR, Prefect, and dbt.
  • Strong programming skills in Python, along with expertise in related technology stacks and Unix/Shell scripting.
  • Comprehensive knowledge of the data lifecycle, including ETL/ELT processes, data product deployment, and enabling efficient reporting solutions.
  • Hands-on experience with both relational and non-relational databases, including SQL, PL/SQL, and SQL Server Integration Services.
  • Familiarity with parallel computing tools (e.g., Ray, Prefect, multiprocessing) and performant tabular data manipulation tools (e.g., Pandas, Dask, Spark).
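To picture the kind of pipeline work listed above, here is a minimal, illustrative extract-transform-load step in Python using pandas and the standard library's sqlite3. The file path claims_raw.csv, the database warehouse.db, and the table name claims_clean are hypothetical placeholders, and a production pipeline of the sort this role describes would more likely target Snowflake or S3 and be orchestrated with tools such as Prefect or dbt.

# Illustrative sketch only: a tiny extract-transform-load step with pandas.
# Paths, column handling, and table names are hypothetical examples, not
# Progressive's actual systems or schemas.
import sqlite3

import pandas as pd


def extract(csv_path: str) -> pd.DataFrame:
    """Collect raw records from a source file."""
    return pd.read_csv(csv_path)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Apply basic cleaning: normalize column names and drop duplicate rows."""
    return df.rename(columns=str.lower).drop_duplicates()


def load(df: pd.DataFrame, db_path: str, table: str) -> int:
    """Write the cleaned frame to a relational table, replacing prior contents."""
    with sqlite3.connect(db_path) as conn:
        df.to_sql(table, conn, if_exists="replace", index=False)
    return len(df)


if __name__ == "__main__":
    rows = load(transform(extract("claims_raw.csv")), "warehouse.db", "claims_clean")
    print(f"Loaded {rows} rows")

In practice, each of these functions would become a task in an orchestration framework (e.g. Prefect), with the load step pointed at a cloud warehouse and downstream transformations modeled in dbt.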
Responsibilities

Please refer to the job description above for details.
