DataOps Engineer (m/f/x) at ALDI SÜD
Nordrhein-Westfalen, Germany - Full Time


Start Date

Immediate

Expiry Date

11 Sep, 25

Salary

Not specified

Posted On

11 Jun, 25

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

At ALDI DX, we develop innovative digital products and services for our employees as well as our customers in 11 ALDI SÜD countries and over 7,300 ALDI SÜD stores worldwide. We drive digital value to offer great quality at the lowest price.
We will be guided along the way by the three core values of the ALDI SÜD Group – simplicity, reliability and responsibility. Our team and our performance are also at the heart of everything we do at ALDI DX.
Your Job
What you give your best for.
Monitoring, restarting, analysing, fixing and improving existing data pipelines between source systems and the data lake in both directions (see the sketch after this list)
Communicating the impact of service degradations to the data lake user community and the internal service management team
Handling incident and problem management for the team
Observing, controlling and optimising the cluster configuration (e.g. setup, version, credentials) in collaboration with the cloud team
Developing and maintaining squad-specific data architecture and pipelines that adhere to defined ETL and data lake principles
Solving technical data problems that help the business area achieve its goals
Proposing and contributing to education and improvement plans for IT operations capabilities, standards, tools and processes
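To give a flavour of the pipeline monitoring described above, here is a minimal, hypothetical PySpark sketch of a freshness check on a data lake table. The table name, timestamp column and threshold are illustrative placeholders, not part of this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline-freshness-check").getOrCreate()

TABLE = "lakehouse.sales_orders"   # hypothetical data lake table
TS_COLUMN = "ingested_at"          # hypothetical ingestion-timestamp column
MAX_LAG_SECONDS = 2 * 60 * 60      # hypothetical freshness threshold: 2 hours

# Compute the lag between now and the latest ingested record, entirely inside Spark.
row = (
    spark.table(TABLE)
    .agg(F.max(TS_COLUMN).alias("latest_ts"))
    .withColumn(
        "lag_seconds",
        F.unix_timestamp(F.current_timestamp()) - F.unix_timestamp(F.col("latest_ts")),
    )
    .collect()[0]
)

if row["lag_seconds"] is None or row["lag_seconds"] > MAX_LAG_SECONDS:
    # In practice this is where an incident would be raised, e.g. via ServiceNow.
    raise RuntimeError(f"{TABLE} looks stale or empty: lag = {row['lag_seconds']} s")

print(f"{TABLE} is fresh: lag = {row['lag_seconds']} s")
```

In day-to-day operations a check like this would typically run as a scheduled Databricks job, with the failure routed into incident management rather than a plain exception.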
Your Profile
What you should have.
Background in computer science
Three years of experience in an IT operations role, working with solutions in distributed computing, big data and advanced analytics
Expertise in SQL, data analysis and at least one programming language (e.g. Python)
Understanding of database administration, ideally using Databricks/Spark and SQL Server DB, as well as knowledge of relational, NoSQL and cloud database technologies
Proficiency in distributed computing and the underlying concepts, preferably Spark and MapReduce (illustrated in the sketch after this list)
Familiarity with Microsoft Azure tools, e.g. Azure Data Factory, Azure Databricks, Azure Event Hub
Operational knowledge of ETL, scheduling, reporting tools, data warehousing as well as structured and unstructured data
Familiarity with the Unix operating system, especially shell scripting
Basic understanding of network level problems and connectivity requirements
Excellent communication skills and business fluency in English; knowledge of German is a plus
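As a small illustration of the Spark/MapReduce concepts listed above, the following hypothetical PySpark snippet expresses a classic map/reduce aggregation; the sample data and column names are invented for the example.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mapreduce-style-aggregation").getOrCreate()

# Hypothetical input: one order line per record.
orders = spark.createDataFrame(
    [("DE-001", 12.5), ("DE-002", 7.0), ("DE-001", 3.2)],
    ["store_id", "amount"],
)

# Map phase: emit (key, value) pairs; reduce phase: sum the values per key.
totals = (
    orders.rdd
    .map(lambda row: (row.store_id, row.amount))
    .reduceByKey(lambda a, b: a + b)
)

for store_id, total in totals.collect():
    print(store_id, round(total, 2))
```

reduceByKey combines values per key on the executors before anything is collected to the driver, which is the core idea behind MapReduce-style aggregation in a distributed setting.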
Your Benefits
How we value your work.
Mobile working within Germany and flexible working hours
State-of-the-art technologies
Attractive remuneration as well as holiday and Christmas bonuses
Future-oriented training and development
Modular onboarding and a buddy system
Health activities
Your Tech Stack
What you work with, among other things.
Azure Databricks
Azure Data Factory
Python
PySpark
ServiceNow
M365
Many more, depending on the job

Responsibilities

Please refer to the job description above for details.
