Data Engineer (SDE 3)

at carwow

Home Office, Nordrhein-Westfalen, Germany

Start Date: Immediate
Expiry Date: 08 May, 2025
Salary: Not Specified
Posted On: 08 Feb, 2025
Experience: N/A
Skills: Python, DevOps, Data Engineering, Airflow, Software, Power BI, Kafka, Ruby, Tableau, SQL, Teams, dbt, Heroku
Telecommute: No
Sponsor Visa: No

Description:

OUR MISSION

Carwow Group is driven by a passion for getting people into cars. But not just any car, the right car. That’s why we are building the go-to destination for car-changing, designed to reach drivers everywhere with our trail-blazing portfolio of personality-rich automotive brands: Carwow, Auto Express, evo, Driving Electric and Car Buyer.
What started as a simple reviews site is now one of the largest online car-changing destinations in Europe: over 10m customers have used Carwow to help them buy and sell cars since its inception. Last year we grew over 50%, with nearly £3bn worth of cars bought on site, while £1.8bn of cars were listed for sale through our Sell My Car service.
In 2024 we went big and acquired Autovia, doubling our audience overnight. Together we now have one of the biggest YouTube channels in the world with over 1.1 billion annual views, sell 1.2 million print copies of our magazines, and have an annual web content reach of over 350 million.

KEY REQUIREMENTS

While time spent in data engineering roles is important, we prioritise attitude, aptitude, and the kind of impact you’ve had over years of experience. Ideally, you:

  • Have significant experience in software or data engineering, preferably in a senior or lead role.
  • Are highly proficient in writing Python and SQL.
  • Have extensive experience building and optimising complex ETL/ELT data pipelines.
  • Have used data transformation tools like dbt.
  • Are skilled in managing and optimising script dependencies with tools like Airflow.
  • Have substantial experience designing and maintaining data warehouses using Snowflake or similar technologies.
  • Have experience leading and mentoring junior data engineers, and guiding teams through complex projects.
  • Are able to contribute to the strategic direction of data engineering practices and technologies within the organisation.
  • Nice to have: experience with Terraform, Ruby, data visualisation tools (e.g., Looker, Tableau, Power BI), Amplitude, DevOps, Heroku, Kafka, AWS/GCP, etc.

You’re not expected to be an expert in all of these technologies and tools; we are happy to support your learning journey. If you’re unsure about any of the above, please apply.

Responsibilities:

KEY RESPONSIBILITIES

  • ETL/ELT Data Pipelines: Leading the design, development, and maintenance of robust ETL/ELT data pipelines.
  • SQL Queries: Writing, optimising, and reviewing advanced SQL queries for data extraction and transformation.
  • Data Workflows: Implementing and managing sophisticated data workflows and dependencies using tools like Airflow.
  • Data Models and Warehouses: Designing, building, and maintaining advanced data models and data warehouses using Snowflake or similar technologies.
  • Cross-functional Collaboration: Collaborating with cross-functional teams to understand complex data requirements and deliver efficient solutions.
  • Data Quality: Ensuring high data quality, integrity, and security across the data lifecycle.
  • Process Improvement: Continuously improving data engineering processes and infrastructure, and implementing best practices.


REQUIREMENT SUMMARY

Min: N/A, Max: 5.0 year(s)

Information Technology/IT
IT Software - DBA / Datawarehousing
Software Engineering
Graduate
Proficient
1
Home Office, Germany