Senior Data Engineer at KAYAK
Cambridge, Massachusetts, USA - Full Time


Start Date

Immediate

Expiry Date

09 Oct, 25


Posted On

09 Jul, 25

Experience

6 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, SQL, Data Architecture, Data Engineering

Industry

Information Technology/IT

Description

KAYAK, part of Booking Holdings (NASDAQ: BKNG), is the world’s leading travel search engine. With billions of queries across our platforms, we help people find their perfect flight, stay, rental car and vacation package. We’re also transforming business travel with a new corporate travel solution, KAYAK for Business.
As an employee of KAYAK, you will be part of a travel company that operates a portfolio of global metasearch brands including momondo, Cheapflights and HotelsCombined, among others. From start-up to industry leader, innovation is at our core and every employee has an opportunity to make their mark. Our focus is on building the best travel search engine to make it easier for everyone to experience the world.
About the Role:
At KAYAK, our mission is to empower everyone to confidently plan their travel. As the world’s leading travel search engine, we process vast amounts of data every day, and our marketing team uses this data to reach millions of travelers across the globe.
We’re looking for a Senior Data Engineer with a solid foundation in building robust data pipelines and a team-player attitude to join our Marketing Data Engineering team. If you thrive in a fast-paced environment, enjoy working across teams, and are eager to embrace change (especially innovations like AI coding assistants), this could be your next adventure.

What you’ll do:

  • Design, build, and maintain high-performance data pipelines and orchestration workflows
  • Write clean, modular Python code to transform, parse, clean, and enrich large datasets
  • Support partners by developing dashboards and visualizations
  • Partner closely with marketing analysts, engineers, and data scientists to define and deliver data needs
  • Actively participate in agile ceremonies, code reviews, and planning discussions
  • Experiment with and use AI coding tools to boost productivity and code quality

Our Tech Stack

  • Languages: Python, SQL
  • Workflow orchestration: Airflow
  • Query engine: Trino
  • Data warehouse: Vertica
  • Source control: Git
  • AI coding tools: Cursor

Your Qualifications and Experience:

  • You have 6+ years of professional experience in data engineering
  • You’re proficient in SQL and Python, and know how to write scalable, maintainable code
  • You’ve worked with AI coding tools and are excited about how they’re shaping the future of development
  • You understand modern data architecture from ingestion to transformation to delivery
  • You’ve built and operated Airflow pipelines (or something similar)
  • You’re used to estimating project scope, managing timelines, and delivering reliably

Soft Skills We Value:

  • You’re an excellent collaborator and communicator, comfortable working with technical and non-technical peers
  • You’re solution-oriented and driven by curiosity
  • You appreciate change and innovation, and you’re quick to adapt your tools and practices
  • You thrive in an international, fast-paced, and feedback-driven environment