Data Engineer (m/f/d) at ABOUT YOU GmbH
Lüneburg, Lower Saxony, Germany - Full Time


Start Date

Immediate

Expiry Date

09 Apr, 26

Salary

Not specified

Posted On

09 Jan, 26

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, Airflow, dbt, Snowflake, SQL, Python, CI/CD, Observability, Monitoring, Data Pipelines, Infrastructure, Developer Experience, Cost-Aware Data Operations, ML Workflows, Data Validation

Industry

Internet Marketplace Platforms

Description
Company Description

Since 2019, ADFERENCE has been a subsidiary of the fashion-tech company ABOUT YOU. We are a fast-growing, innovative tech company that provides online marketing solutions for Google Ads and Amazon. Our tool helps large agencies and brands such as Snocks, Spreadshirt, and Lampenwelt optimize their Amazon Advertising and Google Ads campaigns.

Job Description

We're looking for a mid-level Data Engineer to strengthen our internal data platform and help product teams ship faster, more reliable data-powered features. You'll join our Engineering Foundation chapter, collaborating closely with DevOps, product engineers, and other data engineers to build, operate, and evolve the systems that power our products. You'll also work closely with our cross-functional engineers to create impactful, reliable insights from our data.

Your Responsibilities:
- Design and build new data-powered features for internal and external products, working closely with product and frontend/backend engineers.
- Develop and maintain scalable, well-documented data pipelines using Airflow and dbt, running on Snowflake and other modern cloud tooling (a brief sketch of such a pipeline follows below).
- Create internal tools, APIs, or utilities to make data more accessible and usable across engineering and product teams.
- Contribute to the architecture and implementation of new data products, from ingestion to modeling to serving.
- Set up and monitor data quality, freshness, and health, integrating observability into everything you ship.
- Build and maintain CI/CD workflows for DAGs, dbt models, and platform configuration using GitOps principles.
- Troubleshoot pipeline issues and performance bottlenecks, and proactively improve resilience and execution speed.
- Collaborate with product teams to identify opportunities to simplify and scale data workflows.
- Collaborate on platform improvements such as cost optimization, model run tracking, and efficient use of compute and storage.

Qualifications

What We're Looking For

Required:
- 2–4 years of experience in data engineering or platform-oriented backend roles
- Solid experience with:
  - Airflow DAGs or other task orchestration
  - dbt (Core or Cloud) for data modeling
  - Snowflake or a similar cloud data warehouse
  - SQL and Python for scripting and operational logic
  - CI/CD pipelines (e.g. GitHub Actions)
- Familiarity with observability and monitoring (e.g. Datadog, data freshness checks)
- Comfortable working at the intersection of data pipelines, infrastructure, and developer experience

Bonus:
- Experience in cost-aware data operations or platform governance
- Exposure to ML workflows and model/data validation patterns
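To illustrate the kind of day-to-day work described above, here is a minimal sketch of an Airflow DAG that checks dbt source freshness and then builds dbt models against Snowflake. It is not taken from the posting; the DAG id, schedule, project path, and target name are illustrative assumptions.

# Minimal sketch (illustrative, not from the posting): orchestrate a dbt
# project on Snowflake with Airflow, running a freshness check before the build.
# Assumes Airflow 2.4+ and the dbt CLI installed on the worker.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-platform",          # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="dbt_daily",                # hypothetical DAG id
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
    tags=["dbt", "snowflake"],
) as dag:
    # Verify source freshness before spending warehouse credits on a build.
    check_freshness = BashOperator(
        task_id="dbt_source_freshness",
        bash_command="cd /opt/dbt/project && dbt source freshness --target prod",
    )

    # Build and test dbt models on Snowflake in a single step.
    build_models = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/project && dbt build --target prod",
    )

    check_freshness >> build_models

In practice the dbt invocation might run through a dedicated operator or a container rather than a BashOperator; the sketch only shows the orchestration pattern, not a specific implementation.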

How To Apply:

In case you would like to apply to this job directly from the source, please click here

Responsibilities
Design and build new data-powered features for internal and external products while collaborating with product and engineering teams. Develop and maintain scalable data pipelines and contribute to the architecture of new data products.