Data Engineer at de Bijenkorf
Amsterdam, Netherlands -
Full Time


Start Date

Immediate

Expiry Date

10 Dec, 25

Salary

3.252

Posted On

12 Sep, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Google Cloud Platform, Data Governance, Availability, Kafka, Data Quality, Collaboration, Operational Efficiency, Functional Support, Data Systems, Airflow, Data Infrastructure, Docker, Dbt, Security

Industry

Information Technology/IT

Description

At de Bijenkorf, data plays a central role in how we operate and make decisions, and we continue to expand our capabilities every day. With millions of customers visiting our stores and e-commerce platform, we have access to a wealth of valuable data. We use these insights to support smarter decision-making and create impact across all areas of our business, from customer experience to logistics, assortment, and marketing. Want to help shape what’s next?
About the role

As a Data Engineer at de Bijenkorf, you will play a pivotal role in building, optimizing, and maintaining the data infrastructure that powers our decision-making and customer insights. You’ll work with modern tools and technologies, including Google Cloud Platform, BigQuery, DBT, Airflow, and Terraform, to ensure efficient data pipelines and integrations across our systems. Your expertise will help us drive data-driven strategies, maintain robust data systems, and provide critical support for cross-functional teams. In this way, your work directly contributes to enhancing customer experiences, optimizing operational efficiency, and driving strategic business goals at de Bijenkorf.

  • Develop and Optimize Data Pipelines: Build and maintain reliable data pipelines with GCP, BigQuery, and DBT, ensuring data quality, integrity, and availability.
  • Data Quality and Governance: Ensure data accuracy, consistency, and security by establishing best practices for data governance, monitoring, and auditing across our ecosystem.
  • Collaboration and Cross-Functional Support: Work closely with data analysts, data scientists, and other stakeholders to align on data requirements, support analytical needs, and provide technical expertise across varied high-impact projects.
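To illustrate the data-quality focus above, here is a minimal sketch of the kind of row-level validation a pipeline task might run before loading records into a warehouse table. The field names and rules are hypothetical, not de Bijenkorf's actual schema:

```python
# Hypothetical row-level data-quality check: split incoming rows into
# accepted and rejected sets, recording a reason for each rejection
# so failures can be monitored and audited downstream.

REQUIRED_FIELDS = ("order_id", "customer_id", "amount")

def validate_rows(rows):
    """Return (valid, rejected) where rejected items are
    (row, reason) pairs for auditing."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in REQUIRED_FIELDS if row.get(f) is None]
        if missing:
            rejected.append((row, "missing: " + ", ".join(missing)))
        elif row["amount"] < 0:
            rejected.append((row, "negative amount"))
        else:
            valid.append(row)
    return valid, rejected
```

In a production pipeline this logic would typically live in a DBT test or an Airflow task rather than ad-hoc Python, but the principle is the same: reject and log bad rows instead of silently loading them.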

ABOUT THE TEAM

You will join a centralized Data & Analytics team, working closely with data scientists, analytics engineers, analysts, and tracking & measurement experts. The team works in an Agile way, giving you substantial responsibility and the freedom to make impactful decisions from the start. You’ll be based at our Service Office, with flexible remote work options.
The Data & Analytics team supports a diverse range of organizational departments with their data needs, giving you the chance to work with a variety of data sources and use cases. This role provides a unique opportunity to engage in multiple areas of data engineering.

Who we are looking for

  • A Computer Science (or similar) background
  • Strong fundamentals and 3+ years of experience as a Data Engineer
  • Proficiency in Python and SQL

Experience with other aspects of this role is nice to have, but can also be learned on the job.

  • While hands-on experience with GCP is a big plus, familiarity with a similar cloud platform is also valuable.
  • Experience with DBT is a big plus.
  • Experience with Kafka, Docker, and Airflow.