Senior Streaming Platform Engineer (Data) at On
London, England, United Kingdom
Full Time


Start Date

Immediate

Expiry Date

15 Jan, 26

Salary

Not specified

Posted On

17 Oct, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Apache Kafka, Apache Flink, Spark Streaming, Cloud Platforms, Kubernetes, Docker, Infrastructure as Code, Terraform, CI/CD Pipelines, GitHub Actions, Observability, New Relic, Prometheus, Grafana, Python, Java, Scala

Industry

Retail

Description
In short

We are seeking a highly skilled and motivated Streaming Platform Engineer to join the Data Streaming Platform team. This is a unique hybrid role that combines the disciplines of platform, software, and data engineering to build, scale, and maintain our high-performance, real-time data streaming platform. The ideal candidate has a passion for architecting robust, scalable systems that enable data-driven products and services at massive scale.

Your mission

- Design, build, and maintain the core infrastructure for our real-time data streaming platform, ensuring high availability, reliability, and low latency.
- Implement and optimize data pipelines and stream processing applications using technologies such as Apache Kafka, Apache Flink, and Spark Streaming.
- Collaborate with software and data engineering teams to define event schemas, ensure data quality, and support the integration of new services into the streaming ecosystem.
- Develop and maintain automation and tooling for platform provisioning, configuration management, and CI/CD pipelines.
- Champion the development of self-service tools and workflows that empower engineers to manage their own streaming data needs, reducing friction and accelerating development.
- Monitor platform performance, troubleshoot issues, and implement observability solutions (metrics, logging, tracing) to ensure the platform's health and stability.
- Stay up to date with the latest advancements in streaming and distributed systems technologies, and propose innovative solutions to technical challenges.

Your story

This is a hybrid role, and we understand that candidates may not have experience with every single technology listed. We encourage you to apply if you have a strong foundation in a majority of these areas.

- Streaming Platforms & Architecture: Strong production experience with Apache Kafka and its ecosystem (e.g., Confluent Cloud, Kafka Streams, Kafka Connect). Solid understanding of distributed systems and event-driven architectures, and of how they drive modern microservices and data pipelines.
- Real-Time Data Pipelines: Experience building and optimizing real-time data pipelines for ML, analytics, and reporting, leveraging technologies such as Apache Flink and Spark Structured Streaming, with integration into low-latency OLAP systems like Apache Pinot.
- Platform Infrastructure & Observability: Hands-on experience with major cloud platforms (AWS, GCP, or Azure), Kubernetes, and Docker, coupled with proficiency in Infrastructure as Code (Terraform). Experience integrating and managing CI/CD pipelines (GitHub Actions) and implementing comprehensive observability solutions (New Relic, Prometheus, Grafana) for production environments.
- Programming Languages: Proficiency in at least one of the following: Python, TypeScript, Java, Scala, or Go.
- Data Technologies: Familiarity with data platform concepts, including data lakes and data warehouses.
Responsibilities
- Design, build, and maintain the core infrastructure for a real-time data streaming platform.
- Implement and optimize data pipelines and stream processing applications.