Data Engineer (Azure, Databricks), (Remote) - International organisation at The White Team
Spain - Full Time


Start Date

Immediate

Expiry Date

05 Feb, 26

Salary

Up to 220 €/day

Posted On

07 Nov, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Microsoft Azure Cloud, Databricks, PySpark, Python, SQL, Machine Learning, DevOps, CI/CD, Git, Docker, Kubernetes, Bash, Data Engineering, Data Architecture, ETL, Data Integration

Industry

IT Services and IT Consulting

Description
Job role: Data Engineer
Minimum experience: 4 to 5 years
Location: Europe (Remote)
Studies: Bachelor's degree
Languages: English (C1)

We are seeking a highly skilled Data Engineer to join our team, contributing to the design, development, and optimization of data solutions in cloud-based and distributed environments. The ideal candidate has hands-on experience with Azure Cloud, Databricks, and PySpark, complemented by strong proficiency in SQL, Python, and modern data engineering tools. This role offers the opportunity to work on advanced data integration, analytics, and machine learning projects, ensuring efficient data processing, automation, and deployment of scalable solutions.

Competencies:
The Data Engineer demonstrates strong analytical and technical capabilities, with the ability to design and implement efficient data architectures and pipelines in cloud environments. They possess solid programming skills in Python and SQL, combined with expertise in the Azure and Databricks platforms. The role requires a deep understanding of distributed computing, data modelling, and ETL processes, as well as practical experience with machine learning frameworks and DevOps practices, including CI/CD, version control, and containerization. Strong problem-solving, collaboration, and communication skills are essential to translate business needs into scalable, high-quality data solutions.

IT skills:
- Microsoft Azure Cloud (including Azure DevOps)
- Databricks, PySpark, DBT
- Python and SQL
- Pandas, NumPy
- Machine learning frameworks such as Scikit-learn, PyTorch, Keras
- Git
- Docker, Kubernetes
- Bash and Python scripting
- Attunity
- Continuous Integration / Continuous Deployment (CI/CD)
- Python Poetry, Databricks notebooks
- Distributed computation, software design patterns, prompt engineering, data exploration and analysis, model deployment and monitoring

Language: English (C1)
Location: Europe (Remote)
Rate: 190-220 €/day
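To illustrate the kind of day-to-day work described above, here is a minimal PySpark sketch of a Databricks-style ETL step. It is purely illustrative: the storage path, table name, and column names are hypothetical, and a real pipeline would read from the organisation's own Azure Data Lake storage and Databricks catalogue.

    # Illustrative sketch only: a simple batch ETL step of the kind this role involves.
    # All paths, tables, and columns below are hypothetical examples.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_etl").getOrCreate()

    # Extract: read raw events from a (hypothetical) landing zone in Azure Data Lake Storage.
    raw = spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/events/")

    # Transform: deduplicate, derive a date column, and aggregate per customer per day.
    clean = (
        raw.dropDuplicates(["event_id"])
           .withColumn("event_date", F.to_date("event_timestamp"))
           .filter(F.col("event_date").isNotNull())
    )
    daily = clean.groupBy("event_date", "customer_id").agg(
        F.count("*").alias("event_count"),
        F.sum("amount").alias("total_amount"),
    )

    # Load: write the curated result as a Delta table for downstream analytics
    # (Delta is the default table format on Databricks).
    daily.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_customer_events")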
Responsibilities
The Data Engineer will design, develop, and optimize data solutions in cloud-based environments. They will work on data integration, analytics, and machine learning projects to ensure efficient data processing and scalable solutions.