Data Engineer (Mid-Level) - Streaming Focus (w/m/d) at Billie
Berlin, Germany
Full Time


Start Date

Immediate

Expiry Date

19 Sep, 25

Salary

Not specified

Posted On

19 Jun, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Production Systems, Airflow, Snowflake

Industry

Information Technology/IT

Description

We are Billie, the leading provider of Buy Now, Pay Later (BNPL) payment methods for businesses, offering B2B companies innovative digital payment services and modern checkout solutions. We aim to create a new standard for business payments and have made it our mission to simplify the purchasing experience for all businesses, turning payments into a tool for growth. Our solutions are based on proprietary, machine-learning-supported risk models, fully digitized processes and a highly scalable tech platform. This makes us a deep-tech company building financial products, not the other way around. We love building simple and elegant solutions, and we strive for automation and scalability.

BONUS SKILLS (NICE TO HAVE):

  • Experience with stream processing frameworks like Apache Flink
  • Familiarity with cloud-based analytical data warehouses (Snowflake, BigQuery, Redshift, ClickHouse)
  • Experience managing infrastructure on cloud platforms (AWS, GCP) using tools like Terraform
  • Knowledge of integrating machine learning models into production systems
  • Experience with data orchestration tools (Airflow, Dagster)
  • Understanding of data transformation frameworks (dbt, SQLMesh)
Responsibilities

ABOUT THE ROLE:

We’re seeking a talented and passionate Mid-Level Data Engineer to join our dynamic team at Billie. You’ll work on our most critical systems, including our ML-powered risk assessment engines and the underlying data infrastructure, with a particular focus on real-time data processing capabilities.
This position offers significant autonomy, allowing you to shape our data engineering strategies and make impactful decisions.

YOUR RESPONSIBILITIES:

  • Design, build, and maintain scalable and reliable real-time data streaming pipelines, with a focus on technologies like Kafka
  • Partner with cross-functional teams (data scientists, analysts, software engineers) to implement data infrastructure that powers critical business decisions
  • Work with stakeholders to understand business priorities and develop solutions that solve actual problems
  • Optimize streaming applications for latency, throughput, and reliability
  • Propose innovative solutions and participate in architectural discussions
  • Ensure system reliability and compliance with data privacy regulations (like GDPR)