Data Engineer at Helvetic Payroll
Geneva, Geneva, Switzerland
Full Time


Start Date

Immediate

Expiry Date

11 Jun, 26

Salary

Not specified

Posted On

13 Mar, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

GCP, BigQuery, Cloud Composer, Airflow, dbt, GitLab, Terraform, Python, SQL, IAM, CI/CD, ETL/ELT, Data Modeling, Orchestration, Infrastructure as Code, Observability

Industry

Human Resources Services

Description
Data Engineer – GCP Data Platform

Start date: 1 April 2026
End date: 31 March 2027
Location: Geneva / Hybrid (depending on project requirements)

As part of an ongoing enterprise data transformation program, we are looking for an experienced Data Engineer to support the design, development, and operation of a modern cloud-based data platform on Google Cloud Platform (GCP). The consultant will collaborate closely with data, analytics, and business teams to deliver scalable and reliable data solutions that enable advanced analytics and data-driven decision making.

Mission & Responsibilities

- Design and implement scalable and robust data architectures on Google Cloud Platform, primarily leveraging BigQuery and Cloud Composer (Airflow).
- Develop and maintain ELT/ETL pipelines ensuring reliability, recoverability, and performance of data workflows.
- Build analytics-ready data models using dbt (staging, warehouse, and mart layers), including documentation, testing, and best practices.
- Implement and maintain CI/CD pipelines in GitLab, ensuring code quality, automated deployments, and environment promotion.
- Manage Infrastructure as Code (IaC) using Terraform to provision and maintain cloud infrastructure, IAM configurations, and repeatable environments.
- Develop Python components where required (data ingestion jobs, orchestration utilities, data quality checks, automation scripts).
- Support production operations, including monitoring, alerting, incident resolution, and cost/performance optimization.
- Collaborate with Analytics Engineers, Data Analysts, and business stakeholders to translate analytical requirements into scalable data products.

Technical Environment

- Cloud Platform: Google Cloud Platform (GCP) – BigQuery, IAM, Composer / Airflow
- Data Transformation: dbt Core, packages/macros, testing frameworks, documentation
- DevOps / CI-CD: GitLab (merge requests, branching strategy, CI pipelines, runners, secrets management)
- Infrastructure: Terraform (modules, environment management, state management)
- Development: Python (data pipelines, APIs, automation, testing)

Required Skills

- Strong experience delivering data engineering solutions on GCP, particularly with BigQuery and related services.
- Excellent SQL skills, including performance tuning, partitioning/clustering strategies, and cost optimization.
- Proven hands-on experience with dbt, including project structuring, macros, testing, documentation, and deployment practices.
- Experience with workflow orchestration tools such as Airflow / Cloud Composer (or alternatives like Dagster or Prefect).
- Experience with GitLab CI/CD and collaborative development using Git workflows (code reviews, branching strategies, releases).
- Practical experience with Terraform for infrastructure provisioning and environment management.
- Solid understanding of IAM best practices in cloud environments (least-privilege access, service accounts, group-based permissions).
- Proficiency in Python for data engineering use cases (clean code practices, testing, API integration).
- Strong production mindset, including observability, reliability, and basic security principles.

Nice to Have

- Experience with data observability tools (e.g. Monte Carlo) or custom monitoring frameworks.
- Familiarity with data ingestion platforms such as Airbyte or Fivetran.
- Experience working in modern data platform environments supporting advanced analytics or AI initiatives.
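To give a concrete flavour of the orchestration work described above, the following is a minimal, hypothetical Cloud Composer (Airflow 2.x) DAG sketch in Python that appends one day's worth of raw data into a BigQuery staging table. The project, dataset, table, and DAG names are illustrative placeholders only and do not describe the client's actual platform.

```python
# Minimal Cloud Composer (Airflow 2.x) DAG sketch: one daily BigQuery load step.
# All project, dataset, and table names below are placeholders for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_daily_orders(ds: str, **_) -> None:
    """Append one run date's raw orders into the staging table."""
    client = bigquery.Client(project="example-data-platform")
    query = """
        INSERT INTO `example-data-platform.staging.orders`
        SELECT * FROM `example-data-platform.raw.orders`
        WHERE DATE(ingested_at) = @run_date
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", ds)]
    )
    client.query(query, job_config=job_config).result()  # block until the job finishes


with DAG(
    dag_id="daily_orders_elt",
    schedule_interval="@daily",
    start_date=datetime(2026, 4, 1),
    catchup=False,
    tags=["elt", "bigquery"],
) as dag:
    PythonOperator(task_id="load_daily_orders", python_callable=load_daily_orders)
```

In a setup like the one described here, the transformation logic itself would typically live in dbt models, with the DAG only sequencing ingestion, dbt runs, and data quality checks.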
Responsibilities
The Data Engineer will design and implement scalable data architectures on Google Cloud Platform, primarily using BigQuery and Cloud Composer, while developing and maintaining reliable ELT/ETL pipelines. Responsibilities also include building analytics-ready data models with dbt and managing Infrastructure as Code using Terraform.
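As a small illustration of the BigQuery side of these responsibilities (and the partitioning/clustering skills listed above), the sketch below provisions a date-partitioned, clustered table with the official Python client. The project, dataset, and column names are assumptions made for the example only.

```python
# Sketch: create a date-partitioned, clustered BigQuery table so daily queries
# scan only the relevant partition and cluster blocks (cost/performance control).
# Project, dataset, and column names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-data-platform")

schema = [
    bigquery.SchemaField("event_date", "DATE", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("event_type", "STRING"),
    bigquery.SchemaField("payload", "STRING"),
]

table = bigquery.Table("example-data-platform.warehouse.events", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_date"
)
table.clustering_fields = ["customer_id", "event_type"]

client.create_table(table, exists_ok=True)  # no-op if the table already exists
```

In practice such definitions would usually be captured in Terraform (e.g. a google_bigquery_table resource) or in dbt model configurations rather than ad-hoc scripts, so that they are versioned alongside the rest of the infrastructure code.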