Data Engineer at Capgemini
Abu Dhabi, United Arab Emirates
Full Time


Start Date

Immediate

Expiry Date

01 Sep, 25

Salary

Not specified

Posted On

01 Jun, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

PostgreSQL, MongoDB, Cassandra, AWS, Azure, Google Cloud Platform, Spark, Glue, Python, Java, Scala, Data Architecture, Data Science, Analytics, Cloud, Security, Logging, Analytical Skills

Industry

Information Technology/IT

Description

YOUR SKILLS AND EXPERIENCE

  • Analytical Thinking & Problem Solving: Proven ability to dissect complex problems and provide actionable insights in a timely manner.
  • Proficiency with cloud platforms such as AWS (S3, EC2, EMR, Glue, Lambda), Azure (Data Factory, Blob Storage, HDInsight), or GCP (BigQuery, Dataflow, Pub/Sub).
  • Strong programming skills in Python, Java, Scala, or SQL.
  • Experience with big data tools (Hadoop, Spark, Kafka, etc.).
  • Proficiency with ETL processes and data transformation techniques (a minimal sketch follows this list).
  • Knowledge of relational and NoSQL databases (PostgreSQL, MongoDB, Cassandra).
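By way of illustration, a minimal sketch of the extract-transform-load pattern the list above refers to, in plain Python using only the standard library; the source file, table name, and columns are hypothetical placeholders, not details from this posting.

  # Minimal extract-transform-load sketch; file, table, and schema are hypothetical.
  import csv
  import sqlite3

  def extract(path):
      # Extract: read raw rows from a CSV source.
      with open(path, newline="") as f:
          return list(csv.DictReader(f))

  def transform(rows):
      # Transform: normalize types and drop incomplete records.
      return [
          (int(r["user_id"]), float(r["amount"]))
          for r in rows
          if r.get("user_id") and r.get("amount")
      ]

  def load(records, db_path="warehouse.db"):
      # Load: append the cleaned records to a local SQLite table.
      con = sqlite3.connect(db_path)
      con.execute("CREATE TABLE IF NOT EXISTS orders (user_id INTEGER, amount REAL)")
      con.executemany("INSERT INTO orders VALUES (?, ?)", records)
      con.commit()
      con.close()

  if __name__ == "__main__":
      load(transform(extract("orders.csv")))

In a cloud deployment the same three stages would typically read from object storage (S3, Blob Storage, GCS) and write to a managed warehouse rather than a local file.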
Responsibilities

We are seeking an experienced and visionary Data Engineer (Cloud) to lead our data-driven decision-making processes. This role requires someone with a balance of analytical skills, strategic foresight, and the ability to collaborate effectively across diverse teams. The Data Engineer will be instrumental in creating scalable data pipelines, optimizing data architectures, and ensuring the overall security and reliability of cloud-based data solutions. The ideal candidate will have a strong background in data architecture, ETL pipelines, cloud platforms (AWS, Azure, GCP), and big data technologies.

  • Design, develop, and maintain scalable ETL (Extract, Transform, Load) pipelines using cloud-native tools.
  • Implement best practices for data ingestion, transformation, and loading processes across multiple sources.
  • Automate workflows and develop data pipelines using tools such as Apache Airflow, Glue, Databricks, or Dataflow (a sketch follows this list).
  • Architect, build, and maintain cloud-based data solutions using AWS, Azure, or Google Cloud Platform (GCP).
  • Optimize cloud infrastructure for cost, performance, and security.
  • Implement monitoring, logging, and alerting solutions for data pipelines and cloud infrastructure.
  • Bachelor’s degree in Data Science, Analytics, Business, or a related field. Advanced degrees or certifications are a plus.
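As a sketch of the workflow automation described above, here is a hypothetical Apache Airflow DAG (assuming the Airflow 2.4+ API, where the schedule argument replaces schedule_interval) that chains daily extract/transform/load tasks, with retries and a failure callback standing in for the monitoring and alerting this role would own; the DAG id, task bodies, and schedule are placeholders.

  # Hypothetical daily pipeline; names, task bodies, and schedule are placeholders.
  from datetime import datetime, timedelta

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def extract():
      print("pull raw data from the source system")

  def transform():
      print("clean and reshape the extracted batch")

  def load():
      print("write the result set to the warehouse")

  def notify_failure(context):
      # Stand-in for real alerting (email, Slack, PagerDuty, ...).
      print(f"task {context['task_instance'].task_id} failed")

  with DAG(
      dag_id="daily_etl",                       # hypothetical name
      start_date=datetime(2025, 1, 1),
      schedule="@daily",
      catchup=False,
      default_args={
          "retries": 2,                         # retry transient failures
          "retry_delay": timedelta(minutes=5),
          "on_failure_callback": notify_failure,
      },
  ):
      extract_t = PythonOperator(task_id="extract", python_callable=extract)
      transform_t = PythonOperator(task_id="transform", python_callable=transform)
      load_t = PythonOperator(task_id="load", python_callable=load)
      extract_t >> transform_t >> load_t        # enforce ETL ordering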