Data Engineer II at Delivery Hero
Dubai, United Arab Emirates
Full Time


Start Date

Immediate

Expiry Date

08 Jul, 26

Salary

Not disclosed

Posted On

09 Apr, 26

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, Python, SQL, Apache Kafka, Google Cloud Platform, AWS, Apache Airflow, BigQuery, Dataflow, Terraform, Docker, ETL/ELT, Data Modeling, CI/CD, Grafana, Prometheus

Industry

Technology; Information and Internet

Description
Company Description

Since launching in Kuwait in 2004, talabat, the leading on-demand food and Q-commerce app for everyday deliveries, has been offering convenience and reliability to its customers. talabat's local roots run deep, offering a real understanding of the needs of the communities we serve in eight countries across the region. We harness innovative technology and knowledge to simplify everyday life for our customers, optimize operations for our restaurants and local shops, and provide our riders with reliable earning opportunities daily.

Here at talabat, we are building a high-performance culture through an engaged workforce and growing talent density. We're all about keeping it real and making a difference. Our 6,000+ strong talabaty are on an awesome mission to spread positive vibes. We are proud to be a multiple Great Place to Work award winner.

Job Description

About the Role

We're looking for a Data Engineer who's passionate about building reliable, scalable, and cost-efficient data systems. You'll work with a modern stack (Kafka, Google Cloud Platform (GCP), AWS) to design and maintain the pipelines that power analytics, machine learning, and product insights. This is an ideal role for someone with solid foundational skills in data engineering who's ready to deepen their expertise, take ownership of workflows, and collaborate across teams. If you don't know every tool in our stack yet, that's okay. We value curiosity, problem-solving, and a willingness to learn just as much as existing technical skills.

What's On Your Plate?

- Design, build, and maintain data pipelines and workflows for batch and streaming use cases.
- Work with Kafka to manage real-time data ingestion and event-driven architectures.
- Leverage GCP and AWS services for storage, processing, and orchestration (e.g., BigQuery, Dataflow, S3, Lambda).
- Orchestrate workflows using tools like Airflow or similar schedulers.
- Ensure data quality and reliability through monitoring, alerting, and automated validation.
- Collaborate with analysts, data scientists, and product teams to understand requirements and deliver data solutions that drive business impact.
- Optimize for cost and performance across cloud environments.
- Participate in code reviews, documentation, and knowledge sharing to raise the bar for the team.

Our Tech Stack

- Data Ingestion & Streaming: Apache Kafka, Kafka Connect
- Cloud Platforms: Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Storage), AWS (S3, Lambda, Glue)
- Workflow Orchestration: Apache Airflow
- Programming Languages: Python, SQL (bonus: Java/Scala)
- Infrastructure & DevOps: Terraform, CI/CD pipelines, Docker
- Monitoring & Observability: Grafana, Prometheus, cloud-native tools

Qualifications

What Did We Order? (What We're Looking For)

- 1-3 years of experience in data engineering, software engineering, or a related field.
- Proficiency in SQL and at least one programming language (Python preferred).
- Understanding of data modeling, ETL/ELT concepts, and cloud-based data warehouses.
- Familiarity with streaming platforms (Kafka, Kinesis, or similar).
- Comfort working in cloud environments (GCP, AWS, or Azure).
- Strong communication skills: able to explain technical concepts to non-technical audiences.
- Growth mindset: eager to learn, adapt, and take on new challenges.

Nice-to-Have (Not Required, But Willing to Learn)

- Experience with infrastructure-as-code (Terraform, CloudFormation).
- Exposure to containerization (Docker, Kubernetes).
- Knowledge of data governance, security, and compliance best practices.

Additional Information

Why You'll Love Working Here

- Impact: Your work will directly influence how data powers decisions across the company.
- Learning culture: We invest in your growth, from mentorship to training budgets.
- Modern stack: Work with cutting-edge tools and cloud platforms.
- Collaboration: Partner with talented engineers, analysts, and product managers.
- Flexibility: We care about outcomes, not where you work from.

Our Hiring Philosophy

We know that a great data engineer isn't defined by checking every box. If you're excited about data engineering, have a solid foundation, and are eager to grow, we want to hear from you.
Responsibilities
Design, build, and maintain scalable data pipelines and streaming workflows using cloud-native technologies. Collaborate with cross-functional teams to deliver data solutions that drive business impact and ensure high data reliability.