Senior Data Engineer at Podimo
Berlin, Germany -
Full Time


Start Date

Immediate

Expiry Date

25 Nov, 25

Salary

0.0

Posted On

25 Aug, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Kubernetes, Scripting Languages, English, Python, Snowflake, Automation, Teamwork, Code, Communication Skills, Continuous Improvement, Orchestration, Entertainment, Cost Control, Learning, Containerization, Data Governance, Docker, Google Cloud Platform

Industry

Information Technology/IT

Description

We are one of the fastest-growing podcast and audiobook platforms in Europe, and we are now looking for a Senior Data Engineer to join our Platform team.

THIS IS YOU

We are looking for a colleague who is motivated by, and takes great pride in, building and maintaining innovative and reliable data solutions. You understand what it means to work in a dynamic, fast-paced startup environment, and you thrive when collaborating cross-functionally with colleagues across the company. You own the solutions you build and actively make them better.

EXPERIENCE WE’RE LOOKING FOR:

  • Experience with various cloud providers, Google Cloud Platform (GCP) preferred
  • More than 5 years of data engineering experience
  • Proficiency in one or more scripting languages, such as Python, for building data pipelines and automation
  • Hands-on experience with setting up and maintaining batch, streaming, and event-driven data setups, as well as workload orchestration tools (e.g., Airflow, Spark, Beam)
  • Proficiency in working with APIs, including the ability to create and maintain services for interacting with various APIs
  • Experience managing platform infrastructure on a cloud provider, including building data pipelines, monitoring systems, applying CI/CD practices, and using infrastructure-as-code tools such as Terraform or Pulumi
  • Prior hands-on experience with data platforms such as Google BigQuery or Snowflake
  • Experience with cost control, monitoring, and alerting solutions
  • The ability to propose multiple solutions to a problem, understand their trade-offs, and pick the best one
  • Experience refactoring data pipelines
  • Experience implementing data governance and compliance best practices
  • Solid SQL skills and foundational understanding of data warehousing principles
  • Proficiency in Docker and Kubernetes for containerization and orchestration
  • A collaborative mindset and a passion for teamwork, continuous improvement, and learning, combined with the ability to work highly independently while ensuring alignment with team goals
  • Excellent communication skills in English (written and verbal), with the ability to articulate complex data concepts clearly and effectively to diverse audiences
    Naturally, being passionate and knowledgeable about podcasts, audio content, media, and entertainment is a big plus!

THIS IS THE HIRING PROCESS

First things first: we ask you to submit your CV or provide a link to your LinkedIn profile. Sharing your motivation for the job is highly appreciated, but not expected. If we pick your application to continue to the interview stage, this is how we plan to run the interview process:

  • An initial conversation with a recruiter to set expectations
  • A technical interview with the hiring manager
  • An interview with the CTO or a BI & Analytics Engineering stakeholder
  • A coffee chat with some of your potential future team members from Platform and BI

We might tweak the process to keep things smooth and enjoyable. We will be conducting interviews continuously, aiming for a start date of September 1st.
We are looking forward to hearing from you!

Responsibilities

THIS IS THE ROLE

At Podimo, data is at the centre of nearly everything we do, guiding decisions and fuelling our business processes. This is where you come into the picture!
In this position, you will join the Platform team, which provides company-wide infrastructure, internal platforms, and tooling. Your time will be fully dedicated to supporting the central Business Intelligence team, acting as their data engineering counterpart and leading efforts and processes that enable them to deliver reporting and analytics solutions.
You will collaborate with analytics engineers and business teams, and be responsible for ensuring a solid, cost-effective, and scalable technical platform for our BI work.

YOUR MAIN RESPONSIBILITIES IN THIS ROLE:

  • Build and maintain robust, compliant and scalable storage and data ingestion solutions to form the data foundation for business intelligence and analytics
  • Design, develop and maintain pipelines to deliver curated data and insights from our internal data platform to external systems (via APIs or other integrations)
  • Collaborate closely with analytics engineers to understand their needs and meet these with high-quality solutions, driving efficiencies in their work
  • Define, implement, and evolve our data platform architecture and infrastructure, following industry best practices
  • Take ownership of workflow orchestration, including scheduling, coordination, and monitoring & alerting of data processes
  • Collaborate with relevant engineering squads across the Tech department to optimise infrastructure, align practices, and coordinate source system changes
  • Continuously optimise infrastructure and workflows for performance and cost-efficiency
  • Develop and maintain clear documentation of architecture, systems, and processes