Data Platform Engineer (f/m/d) at Deutsche Börse
Frankfurt am Main, Germany
Full Time


Start Date

Immediate

Expiry Date

18 Oct 2025

Salary

Not specified

Posted On

19 Jul 2025

Experience

0 year(s) or above

Remote Job

Yes

Sponsor Visa

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

BUILD THE FUTURE OF FINANCIAL MARKETS. BUILD YOURS.

Ready to make a real impact in the financial industry? At Deutsche Börse Group, we’ll empower you to grow your career in a supportive and inclusive environment. With our unique business model, driven by 15,000 colleagues around the globe, we actively shape the future of financial markets. Join our One Global Team!

WHO WE ARE

Deutsche Börse Group is one of the world’s leading exchange organisations and an innovative market infrastructure provider. With our products and services, we ensure that capital markets are fair, transparent, reliable, and stable. Together, we develop state-of-the-art IT solutions and offer our IT systems all over the world. Play a key role in our mission: to create trust in the markets of today and tomorrow.

YOUR CAREER AT DEUTSCHE BÖRSE GROUP

Your area of work:
Big Data & Advanced Analytics provides the data platform and services that enable data science and analytics for the businesses across the value chain served by Deutsche Börse Group. We standardize and automate processes with modern technologies and frameworks.
As part of our cloud migration and digital transformation journey, we are looking for an experienced and passionate Data Platform Engineer with a diverse range of skills, fresh and innovative ideas, and a positive attitude towards designing and building big data solutions.
In this role you will join a growing and diverse team of big data experts. Topics range from helping to define and deploy the overall cloud data architecture, including cloud service evaluation, platform design, and configuration, to supporting the implementation of business use cases in the big data and data science fields at one of the biggest exchanges in the world.

Your responsibilities:

  • Provision and configure Databricks workspaces using Terraform, the CLI, and the SDK (see the sketch after this list)
  • Set up workspace-level settings including clusters, libraries, and compute policies
  • Define and manage catalogs, schemas, and tables across workspaces
  • Ensure the data platform is operational, secure, scalable, and reliable
  • Contribute to defining the Data Mesh paradigm across different data domains
  • Act as a technical advisor to data scientists, analysts, and business users
  • Write and maintain technical documentation and work within an Agile methodology
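
For illustration only (this sketch is not part of the original posting): one way the catalog, schema, and compute-policy tasks above can look with the Databricks SDK for Python. The catalog and schema names, the policy definition, and the authentication setup are assumptions, not details from the role.

```python
# Minimal sketch, assuming the databricks-sdk Python package is installed
# and credentials are provided via the environment (e.g. DATABRICKS_HOST,
# DATABRICKS_TOKEN) or a configuration profile. All names are illustrative.
import json

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Define and manage catalogs and schemas in Unity Catalog.
catalog = w.catalogs.create(name="analytics_dev", comment="Analytics sandbox")
schema = w.schemas.create(name="trades", catalog_name=catalog.name)

# A simple compute policy capping cluster autoscaling, as an example of
# a workspace-level setting.
policy = w.cluster_policies.create(
    name="small-clusters-only",
    definition=json.dumps({
        "autoscale.max_workers": {"type": "range", "maxValue": 4},
    }),
)

print(f"Created {catalog.name}.{schema.name} and policy {policy.policy_id}")
```

In practice such resources would typically be declared in Terraform rather than created imperatively, so that workspace configuration stays reviewable and reproducible.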

Your profile:

  • Hands-on experience with cloud-native big data technologies (Azure/GCP)
  • Experience building data pipelines with tools such as Azure Data Factory, Apache Beam, or Apache Airflow (a minimal Airflow example follows this list)
  • Familiarity with at least one data platform or processing framework such as Kafka, Spark, or Flink; experience with Delta Lake and Databricks is a big plus
  • Demonstrated experience in one or more programming languages, preferably Python
  • Knowledge in CI/CD tools such as GitHub Actions is a plus
  • Knowledge of data management, monitoring, security, and privacy
  • Strong team player willing to cooperate with colleagues across office locations and functions
  • Very strong English language skills are a must
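
As a hedged illustration of the pipeline tooling named above: a minimal Apache Airflow DAG. The DAG id, schedule, and task bodies are placeholders, and it assumes Airflow 2.4+ (for the `schedule` argument).

```python
# Minimal sketch of an orchestrated data pipeline in Apache Airflow.
# Task logic is a placeholder; real tasks would call extract/transform code.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from a source system")


def transform():
    print("clean and enrich the extracted data")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2  # run transform only after extract succeeds
```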

RECRUITING TEAM

Take your career to the next level with us and embrace new challenges!
+496921111810
Our Recruiting Team is looking forward to your call or e-mail.
