Senior Data Platform Engineer (m/f/x) at Daiichi Sankyo Europe
81379 München, Germany
Full Time


Start Date

Immediate

Expiry Date

30 Oct, 25

Salary

0.0

Posted On

01 Aug, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Infrastructure, Git, Information Technology, DevOps, Data Warehousing, Data Engineering, Data Science, Python, Docker, Linux, SQL

Industry

Information Technology/IT

Description

PASSION FOR INNOVATION. COMPASSION FOR PATIENTS.

With over 120 years of experience and approximately 19,000 employees in more than 30 countries/regions, Daiichi Sankyo is dedicated to discovering, developing, and delivering new standards of care that enrich the quality of life around the world.
In Europe, we focus on two areas: The goal of our Specialty Business is to protect people from cardiovascular disease, the leading cause of death in Europe, and help patients who suffer from it to enjoy every precious moment of life.
In Oncology, we are driving innovation in solid tumours and blood cancers, founded on breakthrough science from our own labs in Japan. We aspire to create better tomorrows for people living with cancer and their loved ones.
Our European headquarters are in Munich, Germany, and we have affiliates in 15 European countries and Canada.
For our headquarters in Munich, we are looking for a Senior Data Platform Engineer (m/f/x).

PERSONAL SKILLS AND PROFESSIONAL EXPERIENCE

  • Several years of IT experience, including data engineering, DevOps, or platform engineering roles
  • Strong expertise in SQL, Python, Linux, Docker, Git, and cloud providers (Azure preferred)
  • Understanding of Data Warehousing, Data Pipelines, ETL/ELT, and engineering practices like Infrastructure as Code and CI/CD
  • Excellent communicator and collaborator with internal and external teams; technology-agnostic
  • Ability to comprehend new technologies quickly, with intellectual curiosity and integrity
  • Strategic thinker with a clear vision for data practices and engineering; proactive and results-driven
  • Bachelor’s degree in Information Technology, Business Administration, Data Science, or related field
Responsibilities
  • Platform Operations & Automation: Design and maintain data platform infrastructure using Azure, Snowflake, dbt Cloud, and Airflow. Implement Infrastructure as Code with Terraform, manage multi-environment workflows (DEV/UAT/PRD), and develop custom tools for platform management and resource provisioning
  • Developer Experience: Build self-service capabilities and internal tooling (common libraries, shared code, reusable components) to enable 30+ data engineers. Create code generation tools, streamline CI/CD pipelines for 100+ repositories, and establish development environment configuration guidelines, code quality gates, and coding standards
  • Monitoring & Reliability: Implement centralized monitoring, logging, and alerting across the platform stack.
  • Documentation & Knowledge: Create tutorials and runbooks, promote platform standards and best practices, help onboard new data engineers, and actively share knowledge across our internal data engineering community. Help grow our team through mentoring.
  • Communicate regularly with team members and other relevant parties, conveying results clearly and efficiently