Data Streaming Engineer at Etihad Airways Australia
Abu Dhabi, Abu Dhabi Emirate, United Arab Emirates - Full Time


Start Date

Immediate

Expiry Date

06 Feb, 26

Salary

Not specified

Posted On

08 Nov, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Streaming, Software Engineering, Event-Driven Applications, Confluent Stream Processing, Apache Flink, Kafka, Real-Time Data Ingestion, Data Transformation, Performance Tuning, Data Governance, Schema Management, Java, Scala, Python, Distributed Systems, Microservices

Industry

Airlines and Aviation

Description
Provide software engineering delivery, support capabilities, and technical guidance to customers, including guiding teams. Design, develop, and deploy event-driven applications using Confluent stream processing, Apache Flink, and Kafka. Build real-time data ingestion, transformation, and processing pipelines that meet performance, scalability, and reliability requirements. Collaborate with data engineers, architects, and business stakeholders to define streaming use cases and implement solutions. Implement monitoring, alerting, and performance tuning for streaming applications. Ensure best practices for data governance, schema management, and security in Kafka/streaming ecosystems. Provide support for production systems, troubleshoot issues, and ensure high availability of event-processing pipelines. Contribute to the low-level architecture/technical design aligned to the high-level solution and business requirements. Work within the agile development team to analyse and decompose stories into tasks for simpler implementation. Take responsibility for the end-to-end quality of your own deliverables.

Requirements

Bachelor's degree in a related field such as Computer Science, Computer Engineering, or Software Engineering. 5+ years of experience in real-time event processing or streaming data engineering. Hands-on expertise in the Confluent Platform (Kafka Streams, ksqlDB, Schema Registry, connectors). Strong development experience with Apache Flink for stream processing (stateful computations, windowing, event-time processing). Proficiency in programming languages such as Java, Scala, or Python for building Flink/Kafka applications. Solid understanding of distributed systems, event-driven architecture, and microservices. Experience with cloud-based streaming solutions (AWS Kinesis, Azure Event Hubs, GCP Pub/Sub) is a plus. Familiarity with DevOps practices, CI/CD pipelines, and containerization (Docker/Kubernetes). Strong problem-solving skills, with the ability to work in a fast-paced, agile environment. Knowledge of complex event pattern detection and advanced analytics over streaming data.
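The Flink skills named above (stateful computations, windowing, event-time processing) can be illustrated with a minimal, dependency-free Python sketch. This is not Flink's actual API: the names `count_per_window`, `WINDOW_SIZE`, and `MAX_LATENESS` are illustrative, and the code shows only the core idea of tumbling event-time windows whose per-key state is emitted once a watermark passes the window end.

```python
from collections import defaultdict

WINDOW_SIZE = 10   # tumbling window length in event-time units (illustrative)
MAX_LATENESS = 5   # how far an event may lag the highest timestamp seen

def window_start(ts):
    """Map an event timestamp to the start of its tumbling window."""
    return ts - ts % WINDOW_SIZE

def count_per_window(events):
    """Count events per (window, key) from a stream of (timestamp, key) pairs.

    A window's counts are emitted once the watermark - the highest timestamp
    seen so far minus MAX_LATENESS - passes the window's end, mimicking
    event-time triggering; events older than the watermark are dropped.
    """
    state = defaultdict(int)        # (window_start, key) -> running count
    watermark = float("-inf")
    emitted = []
    for ts, key in events:
        if ts < watermark:
            continue                # too late: behind the current watermark
        state[(window_start(ts), key)] += 1
        watermark = max(watermark, ts - MAX_LATENESS)
        # Fire every window whose end the watermark has passed.
        ready = [wk for wk in state if wk[0] + WINDOW_SIZE <= watermark]
        for wk in sorted(ready):
            emitted.append((wk[0], wk[1], state.pop(wk)))
    # End of stream: flush any windows still open.
    for wk in sorted(state):
        emitted.append((wk[0], wk[1], state[wk]))
    return emitted

events = [(1, "a"), (3, "a"), (4, "b"), (12, "a"), (2, "a"), (17, "b"), (26, "a")]
print(count_per_window(events))
# The out-of-order event (2, "a") arrives after the watermark has advanced
# past it and is dropped; everything else is counted in its window.
```

In Flink itself the same shape is expressed declaratively (e.g. a keyed stream with a tumbling event-time window and a watermark strategy), with the runtime managing the state, triggering, and lateness handling that this sketch does by hand.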

How To Apply:

In case you would like to apply to this job directly from the source, please use the application link in the original posting.

Responsibilities
Design, develop, and deploy event-driven applications while providing technical assistance to customers. Collaborate with teams to define streaming use cases and implement solutions, ensuring high availability and performance of event-processing pipelines.