Data Engineer at Sensys Gatso Group
Amsterdam, Noord-Holland, Netherlands - Full Time


Start Date

Immediate

Expiry Date

15 Jul, 25

Salary

Not specified

Posted On

14 Jun, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

INTRODUCTION

Sensys Gatso, a pioneer in traffic enforcement technology, is actively seeking a skilled and motivated Data Engineer to join our team in Amsterdam. This role offers a unique opportunity to contribute to life-saving traffic solutions by designing and implementing robust data pipelines, ensuring our data warehouse provides accurate, timely, and actionable insights.
As our Data Engineer, you will play a key role in extracting and transforming data from our applications - Puls, Xilium, and Flux - into our Snowflake data warehouse. You’ll work with MySQL and Postgres databases, as well as additional data sources like plain text files, to build reliable, scalable data pipelines. Our data transformations are powered by dbt, ensuring maintainable and modular ELT workflows. Additionally, you’ll help design and implement streaming data ingestion paths using Kafka-like technologies, enabling real-time data processing where needed. Beyond pipeline development, you’ll also contribute basic data analytics skills to support reporting initiatives, primarily using Looker.
This position offers a chance to make a meaningful impact, empowering better decisions through data. While based in our vibrant Amsterdam office, the role asks for some flexibility: you'll spend a minimum of two days per week in the office and occasionally collaborate with our global Sensys Gatso teams.

MORE ABOUT US

Sensys Gatso Group is the leading supplier of system solutions for traffic safety in the field of traffic enforcement systems. Sensys Gatso has subsidiaries in Australia, Germany, the Netherlands, Sweden, the USA, and Latin America, and a branch office in the United Arab Emirates. The Sensys Gatso Group's shares are listed on NASDAQ OMX Stockholm.
Sensys Gatso's product line consists of three products: Flux, Puls and Xilium. Flux is placed at the roadside and acts as an edge device in a distributed traffic enforcement system, comprising both software and hardware. Puls is the data backend where data from roadside systems is collected, analyzed and enriched. Xilium is the frontend where the processing of violations is handled. All three products are built with modern technologies and frameworks, including containerization with Docker and orchestration with Kubernetes. Flux and Puls are written mainly in Go, whereas Xilium, as enterprise software, is Java based.
Each product has its own dedicated product manager, and together they form the foundation of a strong product-driven development team.

Responsibilities
  • Build Data Pipelines: Develop and maintain data pipelines to extract, load, and transform (ELT) data from multiple sources (MySQL, Postgres, and plain text files) into our Snowflake data warehouse.
  • Implement Data Transformations: Utilize dbt to define, manage, and optimize transformations within Snowflake, ensuring modular, testable, and scalable data models.
  • Enable Streaming Data Ingestion: Design and implement streaming data ingestion paths using Kafka-like technologies, allowing for real-time data processing and event-driven architectures.
  • Combine and Integrate Data: Ensure data from Puls, Xilium, and Flux is integrated seamlessly into the warehouse, delivering a single source of truth for analysis and reporting.
  • Optimize Data Workflows: Continuously improve pipeline performance, data quality, and scalability to ensure that data remains reliable and up-to-date.
  • Support Reporting and Analytics: Collaborate with stakeholders to provide foundational analytics and produce reports in Looker. Assist in defining data models and queries to enable insightful reporting.
  • Establish Best Practices: Implement and uphold industry best practices in data engineering, ensuring robust documentation, version control, and security measures.
  • Collaborate Across Teams: Work closely with engineering and product teams to understand data requirements, ensuring that the data pipeline design supports the company’s broader goals.
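To illustrate the extract/load/transform (ELT) split the bullets above describe, here is a minimal, self-contained Python sketch. All field names and the sample data are invented for illustration; in practice the extract/load step would land raw records in Snowflake unchanged, and the transform step would live in dbt as SQL models rather than in Python.

```python
import csv
import io

# Hypothetical sample of a plain-text source file, standing in for the kind
# of flat-file sources loaded alongside MySQL and Postgres data.
RAW_EXPORT = """\
violation_id,site_code,measured_speed_kmh,limit_kmh
1001,NL-AMS-01,72,50
1002,NL-AMS-02,48,50
1003,NL-AMS-01,95,50
"""

def extract(text: str) -> list[dict]:
    """Extract/load: parse the raw export into untyped records.
    In an ELT flow these rows are loaded into the warehouse as-is."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: a dbt-style staging step, done in Python here only for
    illustration -- cast types and derive an over-limit flag."""
    staged = []
    for row in rows:
        speed = int(row["measured_speed_kmh"])
        limit = int(row["limit_kmh"])
        staged.append({
            "violation_id": int(row["violation_id"]),
            "site_code": row["site_code"],
            "speed_kmh": speed,
            "over_limit": speed > limit,
        })
    return staged

staged = transform(extract(RAW_EXPORT))
print(sum(1 for r in staged if r["over_limit"]))  # prints 2
```

The point of the ELT ordering is that raw data is preserved in the warehouse before any transformation, so staging models can be rebuilt, tested, and versioned independently of ingestion.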