[Google GDC] Data Engineer / DevOps
at CAPGEMINI ENGINEERING
Lisboa, Área Metropolitana de Lisboa, Portugal
| Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
|---|---|---|---|---|---|---|---|
| Immediate | 15 Feb, 2025 | Not Specified | 16 Nov, 2024 | 3 year(s) or above | Design, Data Integrity, Kafka, Transformation, Analytics, Data Processing, Data Engineering, Storage, Python, Data Quality, Data Extraction | No | No |
Description:
ABOUT OUR COMPANY
At Capgemini Engineering, the world leader in engineering services, we bring together a global team of engineers, scientists, and architects to help the world’s most innovative companies unleash their potential. From autonomous cars to life-saving robots, our digital and software technology experts think outside the box as they provide unique R&D and engineering services across all industries. Join us for a career full of opportunities. Where you can make a difference. Where no two days are the same.
Responsibilities:
We are seeking a skilled Data Engineer to join our dynamic team. The ideal candidate will be responsible for developing, maintaining, and optimizing our data pipeline architecture while ensuring the availability and performance of critical data workflows. If you are passionate about data engineering, have hands-on experience with cloud platforms and big data technologies, and are a strong problem-solver, this is the role for you.
- Design, build, and maintain scalable, efficient, and reliable data pipelines to support data processing and analytics;
- Implement best practices in Big Data tools and frameworks, ensuring data integrity and performance;
- Collaborate with cross-functional teams to integrate new data sources and optimize data pipelines;
- Work with complex datasets, ensuring data quality, transformation, and storage in BigQuery;
- Troubleshoot data pipeline issues and implement long-term solutions to improve system stability;
- Write and optimize complex SQL queries for data extraction and analysis;
- Utilize Kafka for real-time data streaming and event-driven architecture;
- Implement solutions using Python for data processing and ETL workflows (illustrated in the sketch below this list);
- Perform code reviews and maintain high-quality code standards.
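To give a concrete flavour of the Kafka, Python, and BigQuery work listed above, here is a minimal, illustrative sketch of one ingestion step. It assumes the `confluent-kafka` and `google-cloud-bigquery` client libraries; the broker address, topic name, table ID, and batch size are hypothetical placeholders, not details from this posting.

```python
# Illustrative Kafka-to-BigQuery micro-batch loader (names are placeholders).
import json

from confluent_kafka import Consumer
from google.cloud import bigquery

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed broker address
    "group.id": "etl-demo",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])  # hypothetical topic

bq = bigquery.Client()
table_id = "my-project.analytics.events"  # hypothetical dataset.table

rows = []
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            # Surface broker/partition errors instead of dropping them silently
            raise RuntimeError(msg.error())
        rows.append(json.loads(msg.value()))
        if len(rows) >= 500:  # micro-batch to limit streaming-insert calls
            errors = bq.insert_rows_json(table_id, rows)
            if errors:
                raise RuntimeError(f"BigQuery insert errors: {errors}")
            rows.clear()
finally:
    consumer.close()
```

Micro-batching the streaming inserts is one common way to balance latency against BigQuery quota usage; the same client's `query()` method covers the kind of SQL extraction and analysis work mentioned above.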
REQUIREMENT SUMMARY
Experience: Min 3.0, Max 4.0 year(s)
Industry: Information Technology/IT
Functional Area: IT Software - DBA / Datawarehousing
Specialization: Software Engineering
Education: Graduate
Proficiency: Proficient
Vacancies: 1
Location: Lisboa, Portugal