Data Engineer at Inetum
Bucharest, Romania
Full Time


Start Date

Immediate

Expiry Date

22 Dec, 25

Salary

Not specified

Posted On

23 Sep, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, Databricks, AWS, Python, R, Spark, Hadoop, Hive, Kafka, Debezium, Golden Gate, Kubernetes, EKS, OpenShift, CI/CD, Linux

Industry

IT Services and IT Consulting

Description
Company Description

Our Mission Statement
Digital and human resources at the center of the sustainable development of our society. In a world of continuous transformation, accelerated by technological developments and societal challenges, it is necessary to adapt in an ongoing, agile way to meet the challenges of the future.

About Inetum
Inetum is a European leader in digital services. Inetum's team of 28,000 consultants and specialists strives every day to make a digital impact for businesses, public-sector entities, and society. Inetum's solutions aim to contribute to its clients' performance and innovation as well as the common good. Present in 19 countries with a dense network of sites, Inetum partners with major software publishers to meet the challenges of digital transformation with proximity and flexibility. Driven by its ambition for growth and scale, Inetum generated sales of 2.5 billion euros in 2023.

Job Description

Mission:
- Design, implement, manage, and monitor end-to-end IT solutions for Advanced Analytics and Data Replication (CDC) platforms
- Ensure the seamless operation of services within SLA performance parameters, with a focus on Databricks and cloud-based solutions
- Innovate and optimize IT services and systems, emphasizing cloud solutions such as Databricks and AWS
- Implement Data Products in a Data Mesh architecture
- Support and maintain existing applications, ensuring they run efficiently and effectively
- Work closely with the Data team to transition data pipelines into optimized, production-ready states for various workflows
- Administer and manage Data environments, the tech stack, and traditional databases
- Implement automation and CI/CD practices using tools such as Git, Jenkins/GitHub Actions, and Ansible/Terraform
- Participate in diagnosing and solving complex problems, documenting configurations to maintain best practices across the IT infrastructure

Qualifications

Profile:
- Bachelor's or master's degree in Computer Science, Engineering, or a related technical field
- Minimum 3 years of experience in Data Engineering, Data DevOps, or a relevant field, preferably in the banking or telecommunications industry
- Advanced knowledge of specific Data Engineering technologies: Spark, Hadoop, Hive, Python, R, Databricks, Kafka, Debezium, Golden Gate (2 or more preferred)
- Strong understanding of networking and microservices architectures such as Kubernetes/EKS/OpenShift
- Experience with cloud architecture and Big Data solutions, particularly Databricks and AWS
- Automation & CI/CD tools (Git, Jenkins/GitHub Actions, Ansible/Terraform, Shell/Python)
- Excellent problem-solving skills with the ability to collaborate effectively with various teams
- Professional certifications would be a plus, especially combined with practical experience of the technology
- Experience in the banking or telecommunications industry
- Proficient in Linux, with advanced system management and shell scripting skills
- Knowledge of Service Management and software development methodologies
- Professional certifications in relevant technologies or practices

Additional Information

Benefits:
- Full access to a foreign language learning platform
- Personalized access to tech learning platforms
- Tailored workshops and trainings to sustain your growth
- Medical subscription
- Meal tickets
- Monthly budget to allocate on a flexible benefits platform
- Access to 7 Card services
- Wellbeing activities and gatherings
Responsibilities
Design, implement, manage, and monitor end-to-end IT solutions for Advanced Analytics and Data Replication platforms. Ensure the seamless operation of services within SLA performance parameters, focusing on Databricks and cloud-based solutions.