AWS Data Engineer

at  Capgemini

London, England, United Kingdom

Start Date: Immediate
Expiry Date: 02 Feb, 2025
Salary: Not Specified
Posted On: 03 Nov, 2024
Experience: 5 year(s) or above
Skills: Data Storage Technologies, Python, Data Warehouse, Programming Languages, Java, Scala, Hadoop
Telecommute: No
Sponsor Visa: No

Description:

THE JOB YOU’RE CONSIDERING

The Cloud Data Platforms team is part of the Insights and Data Global Practice and has seen strong growth and continued success across a variety of projects and sectors. Cloud Data Platforms is home to the Data Engineers, Platform Engineers, Solutions Architects and Business Analysts who are focused on driving our customers' digital and data transformation journeys on modern cloud platforms. We specialise in using the latest frameworks, reference architectures and technologies on AWS, Azure and GCP.

YOUR SKILLS AND EXPERIENCE

  • Proficiency with AWS Tools: Demonstrable experience using AWS Glue, AWS Lambda, Amazon Kinesis, Amazon EMR, Amazon Athena, Amazon DynamoDB, Amazon CloudWatch, Amazon SNS and AWS Step Functions.
  • Programming Skills: Strong experience with modern programming languages such as Python, Java, and Scala.
  • Expertise in Data Storage Technologies: In-depth knowledge of Data Warehouse, database, and big data ecosystem technologies such as Amazon Redshift, Amazon RDS, and Hadoop.
  • Experience with AWS Data Lakes: Proven experience working with data lakes built on Amazon S3 to store and process both structured and unstructured data sets.
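To illustrate the kind of event-driven processing these AWS tools involve, here is a minimal sketch of a Lambda-style handler consuming an Amazon Kinesis event. The filtering rule (keep records with a positive `amount`) and the field names are hypothetical, not taken from this role's requirements:

```python
import base64
import json

def handler(event, context=None):
    """Hypothetical Lambda handler for a Kinesis trigger: decode each
    base64-encoded record, parse the JSON payload, and keep only
    payloads with a positive amount (an illustrative rule)."""
    accepted = []
    for record in event.get("Records", []):
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        if payload.get("amount", 0) > 0:
            accepted.append(payload)
    return {"accepted": len(accepted), "records": accepted}
```

In a real deployment the returned records would typically be forwarded on (for example to Amazon SNS or S3) rather than returned to the caller.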

Responsibilities:

We are looking for strong AWS Data Engineers who are passionate about cloud technology. You will:

  • Design and Develop Data Pipelines: Create robust pipelines to ingest, process, and transform data, ensuring it is ready for analytics and reporting.
  • Implement ETL/ELT Processes: Develop Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to seamlessly move data from source systems to Data Warehouses, Data Lakes, and Lakehouses using open-source and AWS tools.
  • Adopt DevOps Practices: Utilize DevOps methodologies and tools for continuous integration and deployment (CI/CD), infrastructure as code (IaC), and automation to streamline and enhance our data engineering processes.
  • Design Data Solutions: Leverage your analytical skills to design innovative data solutions that address complex business requirements and drive decision-making.
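The ETL responsibilities above can be sketched as three composable stages. This is an illustrative, self-contained example using only the standard library; in practice the load stage would write to a target such as Amazon Redshift or S3, and the source data, field names, and cleaning rules here are invented for the sketch:

```python
import csv
import io

def extract(raw_csv):
    """Extract: parse CSV rows from a (hypothetical) source system."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: normalise names and cast amounts, dropping rows
    that are missing fields or contain non-numeric amounts."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({
                "name": row["name"].strip().title(),
                "amount": float(row["amount"]),
            })
        except (KeyError, ValueError):
            continue
    return cleaned

def load(rows):
    """Load: stand-in for writing to a warehouse or data lake;
    returned as-is to keep the sketch self-contained."""
    return rows

def run_pipeline(raw_csv):
    """Chain the three stages into one ETL pass."""
    return load(transform(extract(raw_csv)))
```

For example, `run_pipeline("name,amount\n alice ,10.5\nbob,oops\n")` yields a single cleaned record for Alice, with the malformed row silently dropped.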


REQUIREMENT SUMMARY

Min: 5.0, Max: 10.0 year(s)

Information Technology/IT

IT Software - Other

Software Engineering

Graduate

Proficient

1

London, United Kingdom