Senior Data Platform Engineer (m/f/x)

at Daiichi Sankyo Europe

81379 München, Bayern, Germany

Start Date: Immediate
Expiry Date: 15 Jun, 2024
Salary: Not Specified
Posted On: 15 Mar, 2024
Experience: N/A
Skills: Data Science, Vendors, Tuning, Information Technology, Snowflake, Performance Tuning, dbt, pip, SQL, Optimization, Docker, Git, IT, Microsoft Azure
Telecommute: No
Sponsor Visa: No

Description:

PASSION FOR INNOVATION. COMPASSION FOR PATIENTS.

With over 120 years of experience and more than 17,000 employees in over 20 countries, Daiichi Sankyo is dedicated to discovering, developing, and delivering new standards of care that enrich the quality of life around the world.
In Europe, we focus on two areas. The goal of our Specialty Business is to protect people from cardiovascular disease, the leading cause of death in Europe, and to help patients who suffer from it enjoy every precious moment of life. In Oncology, we strive to become a global pharma innovator with a competitive advantage, creating novel therapies for people with cancer.
Our European headquarters are in Munich, Germany, and we have affiliates in 13 European countries and Canada.
For our headquarters in Munich, we are seeking highly qualified candidates to fill the position of Senior Data Platform Engineer (m/f/x).

PROFESSIONAL EXPERIENCE AND EDUCATION:

  • Bachelor’s degree in Information Technology, Business Administration, Data Science, or related field
  • A minimum of 5-7 years in IT, with at least 3 years in data engineering
  • Expert in SQL and at least one analytical RDBMS (architecture, performance tuning)
  • Proficient in designing, coding, and tuning batch and streaming data pipelines
  • Excellent understanding of data warehousing methods and procedures, data architectures, and technologies
  • Experience in Python development and related tools (pip, venv, debugger, unittest/pytest, etc.)
  • Experience with Docker and the Linux terminal
  • Cloud experience with Microsoft Azure
  • Experience in using Git (branching, merge conflict resolution, rebasing, etc.)
  • Experience in using and implementing engineering practices: CI/CD, IaC, auto-testing, etc.
  • A track record of remaining unbiased toward specific technologies or vendors
  • Ideally, experience in performance tuning and cost optimization of cloud-based database solutions (Snowflake, Azure Synapse, Databricks, Google BigQuery)
  • Experience with Apache Airflow, Azure Data Factory, and dbt is a plus
  • Experience with Azure DevOps and Terraform will be helpful

Responsibilities:

  • Design, implement, and improve data pipelines (batch/real-time) for our data analytics platform
  • Migrate/refactor existing analytical solutions to the modern data stack (dbt, Airflow, Snowflake)
  • Design, implement, and improve libraries, software components, and ETL/ELT frameworks, required to integrate our data analytics platform with other company/external services
  • Integrate our tools into seamless and efficient processes; specifically, customize and improve CI/CD pipelines, IaC scripts, and other platform process automation tools
  • Implement complex analytical projects, across different departments and corporate systems with a focus on collecting, transforming, and delivering data
  • Develop standards, guidelines, and other necessary documentation for the data analytics platform
  • Be in constant communication with team members and other relevant parties and convey results efficiently and clearly


REQUIREMENT SUMMARY

Min: N/A, Max: 5.0 year(s)

Information Technology/IT

IT Software - DBA / Data Warehousing

Software Engineering

Graduate

Information technology, business administration, data science, or related field

Proficient

1

81379 München, Germany