Senior Data Engineer at FanDuel
Atlanta, Georgia, USA
Full Time


Start Date

Immediate

Expiry Date

11 Jul, 25

Salary

$183,700

Posted On

12 Apr, 25

Experience

5+ years

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Benefits

Life Insurance, Dental Insurance, Disability Insurance

Industry

Information Technology/IT

Description

ABOUT FANDUEL

FanDuel Group is the premier mobile gaming company in the United States. FanDuel Group consists of a portfolio of leading brands across mobile wagering including: America’s #1 Sportsbook, FanDuel Sportsbook; its leading iGaming platform, FanDuel Casino; the industry’s unquestioned leader in horse racing and advance-deposit wagering, FanDuel Racing; and its daily fantasy sports product.
In addition, FanDuel Group operates FanDuel TV, its broadly distributed linear cable television network, and FanDuel TV+, its leading direct-to-consumer OTT platform. FanDuel Group has a presence across all 50 states and Puerto Rico.
The company is based in New York with US offices in Los Angeles, Atlanta, and Jersey City, as well as global offices in Canada and Scotland. The company’s affiliates have offices worldwide, including in Ireland, Portugal, Romania, and Australia.
FanDuel Group is a subsidiary of Flutter Entertainment, the world’s largest sports betting and gaming operator with a portfolio of globally recognized brands and traded on the New York Stock Exchange (NYSE: FLUT).

THE STATS

What we’re looking for in our next teammate

  • 5+ years of experience using Apache Airflow (ideally via Astronomer or MWAA), including:
      • Designing DAGs that are modular, idempotent, and production-grade.
      • Experience with dynamic DAG generation, task retries, and sensor management.
  • 5+ years of development experience in one or more core programming languages, such as Python, with strong coding, debugging, and testing skills.
  • 5+ years of hands-on experience with Databricks and/or an equivalent data platform, including:
      • Building and optimizing ETL/ELT pipelines in Delta Lake or similar data lake formats.
      • Understanding of Unity Catalog, SQL Warehouses, and performance tuning in Databricks.
  • Strong experience working in cloud environments such as AWS, Azure, or GCP, with proficiency in:
      • Cloud-native services (e.g., S3, IAM, Lambda, EventBridge, Secrets Manager).
      • Networking, security, and data storage best practices in the cloud.
  • Experience with infrastructure as code (Terraform preferred) to deploy and manage Airflow/Databricks infrastructure.
  • Familiarity with CI/CD practices, especially around data platform tooling (GitHub Actions, Azure DevOps, or Buildkite).
  • Exposure to data quality, observability, and lineage tools (e.g., Monte Carlo, Datafold).
  • Familiarity with dbt and how it fits into a modern data stack with Databricks.
  • Experience building REST APIs or microservices to support data access or platform automation.
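The "idempotent" requirement above means that re-running a pipeline task for the same logical date must not duplicate data. A minimal Python sketch of the overwrite-by-partition pattern that achieves this (the dictionary stands in for a Delta Lake table; all names are hypothetical, and Airflow itself is not used here):

```python
# Idempotent daily-load pattern: output is keyed by the run's logical
# date, so re-running the same date overwrites the same partition
# instead of appending duplicate rows.
store = {}  # stand-in for a partitioned table (hypothetical)

def load_partition(logical_date: str, rows: list) -> None:
    # Overwrite the whole partition for this date: same input,
    # same final state, no matter how many times it runs.
    store[f"dt={logical_date}"] = list(rows)

load_partition("2025-04-12", [1, 2, 3])
load_partition("2025-04-12", [1, 2, 3])  # rerun: state is unchanged
assert store == {"dt=2025-04-12": [1, 2, 3]}
```

In an Airflow DAG, the same idea usually appears as a task that derives its output path or partition key from the logical date, so retries and backfills are safe by construction.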
Responsibilities
