Senior Data Engineer I at Booking.com
1AC, Netherlands
Full Time


Start Date

Immediate

Expiry Date

15 Nov, 25

Salary

0.0

Posted On

16 Aug, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Modeling, Stakeholder Management, Relational Databases, Coding Standards, SQL, Data Solutions, Scheduling Tools, Design Patterns, Data Vault

Industry

Information Technology/IT

Description

Role Description:
About Us: At Booking.com, data drives our decisions. Technology is at our core. And innovation is everywhere. But our company is more than datasets, lines of code or A/B tests. We’re the thrill of the first night in a new place. The excitement of the next morning. The friends you encounter. The journeys you take. The sights you see. And the memories you make. Through our products, partners and people, we make it easier for everyone to experience the world.
Role: The Senior Data Engineer I is a technical leader who drives data engineering strategy and delivery across teams. The role leads solution envisioning, technical design, and hands-on implementation, and influences, differentiates, and guides the business and technology strategies, as they relate to data, through constant cross-functional interaction. The role holder asks the right questions of the right people to align data strategy with commercial strategy, demonstrating both technical expertise and business knowledge.
In this role, you will innovate and operationalize data pipelines in a modern cloud environment (e.g., AWS, Snowflake), automate workflows (Dagster-centric), manage infrastructure as code (Terraform), and establish robust, auditable CI/CD practices. You’ll partner closely with Data Engineering, FP&A Reporting and analytics teams to deliver timely, reliable, and secure data solutions critical for regulatory (SOx) and business reporting.
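
As a flavour of the Dagster-centric workflow automation mentioned above, here is a minimal, hypothetical sketch of an asset-based pipeline with a daily schedule. The asset names, transformation logic, and cron expression are illustrative assumptions only, not a description of Booking.com's actual pipelines.

```python
# Minimal, hypothetical Dagster sketch: two software-defined assets on a daily schedule.
# Asset names and logic are illustrative assumptions only.
import dagster as dg


@dg.asset
def raw_bookings() -> list[dict]:
    # Placeholder extract step; in practice this would read from a source system.
    return [{"booking_id": 1, "amount": 120.0}, {"booking_id": 2, "amount": 85.5}]


@dg.asset
def curated_bookings(raw_bookings: list[dict]) -> list[dict]:
    # Placeholder transform step: keep only positive-amount bookings.
    return [row for row in raw_bookings if row["amount"] > 0]


defs = dg.Definitions(
    assets=[raw_bookings, curated_bookings],
    schedules=[
        dg.ScheduleDefinition(
            job=dg.define_asset_job("daily_curation", selection="*"),
            cron_schedule="0 6 * * *",  # materialise all assets daily at 06:00
        )
    ],
)
```

In a real setup, the assets would presumably be backed by warehouse resources (e.g. Snowflake) and deployed alongside Terraform-managed infrastructure and CI/CD checks, as described above.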

QUALIFICATIONS & SKILLS:

  • Minimum of 5 years of experience as a Data Engineer or in a similar role, with a consistent record of successfully delivering data solutions.
  • You have built production data pipelines in the cloud, setting up data lake and serverless solutions; you have hands-on experience with schema design and data modeling, and with working alongside business stakeholders and data engineers to deliver production-level data solutions.
  • Experience and/or knowledge of designing and implementing mature Data Warehouse pipelines using Data Vault and/or Dimensional modeling methodologies is a must (a minimal Data Vault sketch follows this list).
  • Working with ETL/ELT tools and methodologies.
  • Working with relational databases and any flavor of SQL in an analytical context.
  • Building data exploration/visualization and designing data storytelling.
  • Communicating effectively (written and spoken) and stakeholder management.
  • Writing and maintaining high-quality and reusable code, applying design patterns and meeting coding standards.
  • Comfortable with working in a DevOps / DataOps environment.
  • A proven record of working with workflow management and scheduling tools such as Apache Airflow and/or Dagster.
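
To make the Data Vault expectation above more concrete, the following is a minimal, hypothetical sketch of deriving a hub record and its hash key in Python. Hashing a normalised business key is a common Data Vault convention; the entity and column names (a HUB_BOOKING-style structure) are illustrative assumptions rather than a prescribed standard.

```python
# Hypothetical sketch of deriving a Data Vault hub record from a source row.
# Hashing the normalised business key (here with MD5) is a common Data Vault
# convention; the table/column names below are illustrative assumptions.
import hashlib
from datetime import datetime, timezone


def hub_hash_key(business_key: str) -> str:
    # Normalise (trim + uppercase) before hashing so equivalent keys match.
    normalised = business_key.strip().upper()
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()


def to_hub_booking(source_row: dict, record_source: str) -> dict:
    """Map a raw source row onto a HUB_BOOKING-style structure."""
    return {
        "hub_booking_hk": hub_hash_key(str(source_row["booking_id"])),
        "booking_id": source_row["booking_id"],
        "load_dts": datetime.now(timezone.utc).isoformat(),
        "record_source": record_source,
    }


if __name__ == "__main__":
    print(to_hub_booking({"booking_id": "  b-1001 "}, record_source="crm_export"))
```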

How To Apply:

In case you would like to apply to this job directly at the source, please click here

Responsibilities

KEY JOB RESPONSIBILITIES AND DUTIES:

  • Producing curated, reusable analytical data products to enable self-serve analytics for many internal customers across departments.
  • Modeling data following best practices and Data Warehousing methodologies such as Data Vault and (Kimball) Dimensional modeling.
  • Transforming large, complex data sets into pragmatic, actionable insights and providing them in a consumable format for historical or predictive analysis.
  • Maintaining and tuning data pipeline health, including troubleshooting issues, implementing data quality controls, monitoring performance, and proactively addressing issues and risks.
  • Leading the technical resolution of problems and communicating the outcomes to both technical and non-technical audiences.
  • Supporting product teams in defining the Data Architecture for their domains, from conceptual to physical modeling in the Data Warehouse.
  • Driving a culture of data quality and data governance, and its best practices, across the business unit.
  • Driving the implementation of reliable and well trusted metrics defined by the business, connecting disparate datasets into unified data products in the Lakehouse and/or Data Warehouse.
  • Performing Data Governance responsibilities such as technical stewardship, data classification, compliance management, data quality monitoring, and security considerations.
  • Working autonomously, self-steering initiatives, and defining and breaking down work for more junior members of the team.
  • Mapping data flows between systems and workflows across the company to improve efficiency and resilience.
  • Developing scalable, real-time event-based streaming data pipelines to support internal and customer-facing use cases.
  • Ensuring ongoing reliability and performance of data pipelines through proactive monitoring, end-to-end testing standards, and incident handling.
  • Writing maintainable, reusable code by applying standard libraries and design patterns, and refactoring for simplicity and clarity.
  • Developing scalable and extensible physical data models aligned with operational workflows and infrastructure constraints.
  • Owning end-to-end data applications by defining and tracking SLIs and SLOs to ensure reliability and quality (a brief sketch follows this list).
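
As a rough illustration of the data quality controls and SLI/SLO ownership listed above, here is a minimal, hypothetical sketch of run-level checks feeding an availability SLI. The check types, thresholds, and the implied SLO target are assumptions for illustration, not actual Booking.com metrics.

```python
# Hypothetical sketch of pipeline-level data quality checks feeding an SLI.
# Thresholds, metric names, and the suggested SLO are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass
class QualityResult:
    check: str
    passed: bool
    detail: str


def check_not_null(rows: list[dict], column: str) -> QualityResult:
    nulls = sum(1 for r in rows if r.get(column) is None)
    return QualityResult("not_null:" + column, nulls == 0, f"{nulls} null value(s)")


def check_freshness(last_loaded: datetime, max_lag: timedelta) -> QualityResult:
    lag = datetime.now(timezone.utc) - last_loaded
    return QualityResult("freshness", lag <= max_lag, f"lag={lag}")


def availability_sli(results: list[QualityResult]) -> float:
    # SLI: fraction of checks passing this run; an SLO might require >= 0.99 over 30 days.
    return sum(r.passed for r in results) / len(results)


if __name__ == "__main__":
    rows = [{"booking_id": 1}, {"booking_id": None}]
    results = [
        check_not_null(rows, "booking_id"),
        check_freshness(datetime.now(timezone.utc) - timedelta(hours=2), timedelta(hours=6)),
    ]
    print([r.__dict__ for r in results], "SLI:", availability_sli(results))
```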

THIS ROLE DOES NOT COME WITH RELOCATION ASSISTANCE.

Booking.com is proud to be an equal opportunity workplace and is an affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. We strive to move well beyond traditional equal opportunity and work to create an environment that allows everyone to thrive.
