Senior Data Engineer (REF5241H) at Deutsche Telekom IT Solutions
Budapest, Central Hungary, Hungary
Full Time


Start Date

Immediate

Expiry Date

15 Jun 2026

Salary

Not disclosed

Posted On

17 Mar 2026

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Pipelines, Databricks, MS Fabric, Python, PySpark, SQL, Delta Lake, Lakehouse, CI/CD, Automated Testing, Git, Azure, ADLS Gen2, Azure Data Factory, Unity Catalog, Key Vault

Industry

IT Services and IT Consulting

Description
Company Description

As Hungary’s most attractive employer in 2025 (according to Randstad’s representative survey), Deutsche Telekom IT Solutions is a subsidiary of the Deutsche Telekom Group. The company provides a wide portfolio of IT and telecommunications services with more than 5,300 employees, serving hundreds of large customers and corporations in Germany and other European countries. DT-ITS received the Best in Educational Cooperation award from HIPA in 2019 and was acknowledged as the Most Ethical Multinational Company the same year. The company continuously develops its four sites in Budapest, Debrecen, Pécs and Szeged and is looking for skilled IT professionals to join its team.

Job Description

We are looking for an experienced and proactive Senior Data Engineer to join our Data & AI tribe! Here, you will have the opportunity to work on diverse projects using modern data technologies. You won't get bored: exciting challenges await, as our clients come from the telecommunications, automotive, healthcare, and public sectors. If you love working with modern data infrastructures, especially in Databricks and MS Fabric environments, and value writing high-quality code, we would love to have you on the team!

What you will do:

- Build Data Pipelines: Design, develop, and operate scalable data ingestion pipelines using Databricks and Azure services.
- Lakehouse Management: Structure and maintain Bronze and Silver Delta Lake datasets, providing "transformation-ready" data for analysts and downstream modeling (a minimal sketch of this pattern follows the description).
- Python-based Development: Build reusable, production-ready Python frameworks and components (modules, packaging, versioning).
- Apply Best Practices: Drive CI/CD workflows, automated testing (unit/integration), and code quality standards in your daily work.
- Reliability & Monitoring: Ensure transparency of data flows (logging, metrics, alerting) and proactively troubleshoot production issues.
- Teamwork & Mentoring: Collaborate closely with the data team (architects, analysts), foster a culture of knowledge sharing, and mentor fellow engineers.

Qualifications

We are looking for you if:

- You have 3–5 years of hands-on experience building data pipelines in production environments.
- You are confident navigating the Databricks and/or Microsoft Fabric ecosystems.
- You have strong Python (PySpark) and SQL skills, and you are comfortable with complex data transformations and performance tuning.
- You have a solid understanding of Delta Lake concepts and Lakehouse architectures.
- Software engineering principles are in your DNA: you confidently use Git, understand CI/CD processes, and value clean, tested code.
- You are proactive, reliable, and able to work independently within an agile team.
- You have strong communication skills in English (spoken and written).

Our Tech Stack:

- Core: Databricks, Microsoft Fabric, Delta Lake
- Languages: Python, PySpark, SQL
- Azure Cloud: ADLS Gen2, Azure Data Factory, Azure Functions
- DevOps & Quality: Git, CI/CD (GitLab/Azure DevOps), automated testing, linting/formatting
- Governance & Security: Unity Catalog, Key Vault

Additional Information

Please note that remote working is only available within Hungary due to European taxation regulations.

Company: Deutsche Telekom TSI Hungary Kft.
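To give a concrete flavor of the Bronze-to-Silver Lakehouse work described above, here is a minimal PySpark sketch of the pattern. It is illustrative only: the ADLS Gen2 paths, the "events" dataset, and the event_id/event_ts columns are hypothetical assumptions rather than project specifics, and the Delta format assumes a Databricks (or other Delta-enabled Spark) runtime.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical storage locations; real paths depend on the project setup.
    BRONZE_PATH = "abfss://bronze@yourlake.dfs.core.windows.net/events"
    SILVER_PATH = "abfss://silver@yourlake.dfs.core.windows.net/events"

    # Bronze: land raw files as-is, keeping ingestion metadata for lineage.
    raw = (spark.read.format("json").load("/landing/events/")  # hypothetical source
           .withColumn("_ingested_at", F.current_timestamp()))
    raw.write.format("delta").mode("append").save(BRONZE_PATH)

    # Silver: deduplicate, enforce types, and drop obviously bad rows so the
    # data is "transformation-ready" for analysts and downstream modeling.
    silver = (spark.read.format("delta").load(BRONZE_PATH)
              .dropDuplicates(["event_id"])                    # hypothetical key
              .withColumn("event_ts", F.to_timestamp("event_ts"))
              .filter(F.col("event_id").isNotNull()))
    silver.write.format("delta").mode("overwrite").save(SILVER_PATH)

In practice a pipeline like this would be split into reusable, versioned modules and wired into CI/CD with tests and monitoring, in line with the responsibilities above.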
Responsibilities
The role involves designing, developing, and operating scalable data ingestion pipelines using Databricks and Azure services, while structuring and maintaining Bronze and Silver Delta Lake datasets. Responsibilities also include building production-ready Python frameworks, driving CI/CD workflows, and ensuring data flow transparency through logging and alerting.
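Since the role also emphasizes automated testing and CI/CD, here is a small, hypothetical pytest sketch for a PySpark transformation; the clean_events function, column names, and sample rows are invented for illustration and are not part of the posting.

    import pytest
    from pyspark.sql import SparkSession, functions as F

    def clean_events(df):
        # Example transformation under test: drop null keys, then deduplicate.
        return df.filter(F.col("event_id").isNotNull()).dropDuplicates(["event_id"])

    @pytest.fixture(scope="session")
    def spark():
        # Small local Spark session so the test can run inside a CI pipeline.
        return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

    def test_clean_events_drops_nulls_and_duplicates(spark):
        df = spark.createDataFrame(
            [("a", 1), ("a", 1), (None, 2)], ["event_id", "value"]
        )
        result = clean_events(df)
        assert result.count() == 1
        assert result.first()["event_id"] == "a"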