Senior Data Engineer at RightsHelper
United States - Full Time


Start Date

Immediate

Expiry Date

01 Jul, 26

Salary

Not specified

Posted On

02 Apr, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, Apache Airflow, AWS, S3, ECS, Lambda, RDS, Redshift, Docker, PostgreSQL, ETL, Data Pipelines, Data Engineering, Data Validation, Observability, Architecture

Industry

Software Development

Description
Summary

We are looking for a hands-on Senior Data Engineer with experience building scalable ETL and data platforms in modern cloud environments. In this role, you will help design and implement the next generation of our data pipelines, enabling our platform to process and analyze growing volumes of ticketing, market, and consumer data. Our existing pipelines contain significant domain expertise and business logic that power our analytics platform. You will translate and evolve this logic into a scalable, maintainable architecture using modern engineering practices. This is a high-impact role with significant ownership over how data moves through our platform and supports the machine learning systems that drive our products.

Job Function

Design and implement the next generation of our core data pipelines
Translate existing pipeline logic into a clean, maintainable architecture
Develop data pipelines in Python using modern best practices
Improve pipeline observability, logging, and failure recovery
Work with AWS services such as S3, Lambda, ECS, RDS, and/or Redshift
Design and implement data validation, monitoring, and alerting
Collaborate closely with leadership to define the future data architecture
Document pipeline design and operational workflows

Our Tech Stack

Python
Apache Airflow
AWS (S3, ECS, Lambda, RDS, Redshift)
Docker
PostgreSQL

What Success Looks Like

By the end of the engagement, you will have helped us:

Deliver the next generation of our core ETL pipelines with a scalable and maintainable architecture
Implement robust data validation, monitoring, and observability across the data platform
Enhance pipeline reliability, performance, and operational visibility

Required Experience

7+ years of software or data engineering experience
Proven experience designing and building ETL/data pipelines from scratch
Strong expertise with Python for data engineering
Deep familiarity with AWS data infrastructure
Experience building production-grade pipelines with logging, monitoring, and error handling

Nice to Have

Experience evolving existing data pipelines into scalable, production-grade data platforms
Experience designing data quality / validation frameworks
Familiarity with data observability and SLAs
Experience with analytics or sports data
Experience working in startup or small team environments

Why This Role Is Interesting

High ownership and architectural influence
Work directly with engineering leadership
Small team with fast decision making
Potential to convert to a full-time leadership role
Responsibilities
Design and implement scalable core data pipelines while evolving existing logic into a maintainable architecture. Collaborate with leadership to define future data architecture and improve pipeline observability, logging, and failure recovery.