Data Engineer
at de Bijenkorf
Amsterdam, Noord-Holland, Netherlands
| Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
|---|---|---|---|---|---|---|---|
| Immediate | 29 Jan, 2025 | ANG 4 Annual | 30 Oct, 2024 | N/A | Functional Support, Data Infrastructure, Kafka, Docker, Data Systems, Data Quality, Data Governance, Google Cloud Platform, Airflow, Availability, Security, DBT, Collaboration, Operational Efficiency | No | No |
Description:
De Bijenkorf is on an exciting journey to become one of the leading data-driven omnichannel retailers in the Netherlands. With millions of customers visiting our stores and e-commerce platform, we have a treasure trove of data at our fingertips. Our goal is to leverage these insights to continuously improve throughout our organization. Want to be a part of this exciting journey?
About the role
As a Data Engineer at de Bijenkorf, you will play a pivotal role in building, optimizing, and maintaining the data infrastructure that powers our decision-making and customer insights. You’ll work with modern tools and technologies, including Google Cloud Platform, BigQuery, DBT, Airflow, and Terraform, to ensure efficient data pipelines and integrations across our systems. Your expertise will help us drive data-driven strategies, maintain robust data systems, and provide critical support for cross-functional teams. The way you work directly contributes to enhancing customer experiences, optimizing operational efficiency, and driving strategic business goals at de Bijenkorf.
- Develop and Optimize Data Pipelines: Build and maintain reliable data pipelines with GCP, BigQuery, and DBT, ensuring data quality, integrity, and availability.
- Data Quality and Governance: Ensure data accuracy, consistency, and security by establishing best practices for data governance, monitoring, and auditing across our ecosystem.
- Collaboration and Cross-Functional Support: Work closely with data analysts, data scientists, and other stakeholders to align on data requirements, support analytical needs, and provide technical expertise across varied high impact projects.
About the team
You will join a centralized Data & Analytics team, working closely with data scientists, analytics engineers, analysts, and tracking & measurement experts. The team works in an Agile way, giving you substantial responsibility and the freedom to make impactful decisions from the start. You’ll be based at our Service Office, with flexible remote work options.
The Data & Analytics team supports a diverse range of organizational departments with their data needs, giving you the chance to work with a variety of data sources and use cases. This role provides a unique opportunity to engage in multiple areas of data engineering.
Who we are looking for
- Strong foundational experience as a Data Engineer (3+ years)
- Proficiency in Python and SQL
- Strong knowledge of CI/CD
Experience with the other aspects of this role is nice to have, but can also be learned on the job:
- Hands-on experience with GCP is a big plus; familiarity with similar cloud platforms is also valuable.
- Experience with DBT is a big plus.
- Experience with Kafka.
- Experience with Docker.
Requirement Summary
- Experience: Min N/A, Max 5.0 year(s)
- Information Technology/IT
- Analytics & Business Intelligence
- Software Engineering
- Graduate
- Proficient
- 1
- Amsterdam, Netherlands