Lead Data Engineer

at DL Remote

13355 Berlin, Gesundbrunnen, Germany

Start Date: Immediate
Expiry Date: 06 Sep, 2024
Salary: Not Specified
Posted On: 07 Jun, 2024
Experience: N/A
Skills: Good communication skills
Telecommute: No
Sponsor Visa: No
Required Visa Status:
Citizen, GC (Green Card), US Citizen, Student Visa, H1B, CPT, OPT, H4 Spouse of H1B

Employment Type:
Full Time, Part Time, Permanent, Independent - 1099, Contract – W2, C2H Independent, C2H W2, Contract – Corp 2 Corp, Contract to Hire – Corp 2 Corp

Description:

We are currently staffing a permanent Lead Data Engineer role at a leading energy solutions company in Hamburg.
In this role, you will join the Analytics & AI team and oversee the development and management of sophisticated data processing systems on the company's Azure Cloud data platform. The role involves advancing data solutions on Databricks, optimizing data acquisition via Azure Data Factory, Logic Apps, Azure Functions, and Spark, and ensuring quality assurance for data pipelines.
You should have 5+ years of experience in Data Engineering, some (lateral) team leadership experience, and in-depth Azure Data Factory and Databricks experience.
Furthermore, you should have strong Python and SQL skills and a comprehensive understanding of Big Data technologies such as Hadoop, Spark, or related distributed computing frameworks.
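To give a flavour of the day-to-day work described above, here is a minimal, hypothetical PySpark sketch of a Databricks pipeline step: reading raw files landed by an upstream ingestion job (e.g. triggered from Azure Data Factory), applying basic cleansing, and writing a Delta table. The paths, table name, and column names are illustrative assumptions, not details from this posting.

```python
# Hypothetical Databricks/PySpark pipeline step; paths and names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("meter-readings-ingest").getOrCreate()

# Read raw CSV files landed by an upstream ingestion job (e.g. Azure Data Factory).
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/landing/meter_readings/")
)

# Basic cleansing: drop rows without a meter id and normalise the timestamp column.
clean = (
    raw.dropna(subset=["meter_id"])
       .withColumn("reading_ts", F.to_timestamp("reading_ts"))
)

# Persist the result as a Delta table for downstream analytics workloads.
clean.write.format("delta").mode("overwrite").saveAsTable("analytics.meter_readings")
```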

Responsibilities:

IN THIS ROLE, YOU WILL WORK ON THE FOLLOWING TOPICS:

Technical Leadership and Best Practices
Provide strong technical leadership and guidance to a team of data engineers, ensuring adherence to best practices in data engineering, cloud computing, and data security.
Streamlining Data Acquisition
Optimize data acquisition processes using Azure Data Factory, Logic Apps, and Azure Functions to improve efficiency and accuracy.
Quality Assurance
Implement robust validation, error handling, and performance consistency across various data sources and processes to ensure the highest quality of data pipelines (a minimal validation sketch follows this list).
Scalability and Performance
Oversee and enhance the scalability of data processing systems, ensuring they can manage increasing data volumes and complex processing needs while maintaining high performance.
Cost Optimization
Continuously monitor and optimize cloud resource usage and expenses, implementing cost-effective solutions without compromising performance and scalability.
Automation and CI/CD
Implement and maintain continuous integration and deployment (CI/CD) practices for data pipelines to improve efficiency, reduce manual errors, and accelerate deployment processes.
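As a purely illustrative example of the quality-assurance work listed above, the following hedged Python/PySpark sketch shows a simple post-load validation step that fails a pipeline run when a table is empty or a key column contains too many nulls. The table name, key column, and threshold are assumptions made for illustration, not requirements from this posting.

```python
# Hedged sketch of a post-load data quality check; names and thresholds are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def validate_table(table_name: str, key_column: str, max_null_ratio: float = 0.01) -> None:
    """Fail the pipeline run if the table is empty or the key column has too many nulls."""
    df = spark.table(table_name)
    total = df.count()
    if total == 0:
        raise ValueError(f"{table_name}: no rows loaded")

    null_ratio = df.filter(F.col(key_column).isNull()).count() / total
    if null_ratio > max_null_ratio:
        raise ValueError(
            f"{table_name}: {null_ratio:.2%} nulls in {key_column} "
            f"exceeds the allowed {max_null_ratio:.2%}"
        )

# Example usage as a task after the load step in a Databricks job; a raised error fails the task.
validate_table("analytics.meter_readings", key_column="meter_id")
```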


REQUIREMENT SUMMARY

Min: N/A, Max: 5.0 year(s)

Information Technology/IT
IT Software - DBA / Datawarehousing
Software Engineering
Graduate
Proficient
1
13355 Berlin, Germany