Senior Data / Kafka Engineer at ClearPoint Limited
Auckland, New Zealand
Full Time


Start Date

Immediate

Expiry Date

23 Jun, 26

Salary

Not specified

Posted On

25 Mar, 26

Experience

10 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, Apache Kafka, Data Pipelines, Distributed Data Processing, SQL, Relational Databases, Kubernetes, AWS EKS, Data Modelling, CDC, Data Transformation, Microservices, Streaming Systems, Backend Services, Cloud-Native Infrastructure, Real-time Data Processing

Industry

IT Services and IT Consulting

Description
A bit about us

For over 19 years, ClearPoint has been the trusted technology partner for organisations navigating complex digital transformation. We combine software engineering, AI, data and insights, cloud, and human-centred design to deliver meaningful, lasting outcomes. We work with some of New Zealand’s most impactful organisations, and we care deeply about the quality of what we build. Our Auckland head office overlooks the Viaduct (5 minutes from Britomart), with co-working access across NZ and Australia. Through innovative thinking, we strive to make a difference for ourselves, our teams, our clients and our community across New Zealand, Australia and the UK.

The ClearPoint culture

At ClearPoint, we’re proud of the culture we’ve built: collaborative, curious, and high-performing. Our values aren’t just words:

- Respect and Care for People
- Act with Integrity
- Earn and Nurture Trust

You’ll work with sharp, passionate people who take ownership, raise the bar, and genuinely enjoy what they do. Whether you’re in Auckland, Wellington, or Christchurch, you’ll be part of a team that invests in your growth and makes space for you to shine.

The Role

ClearPoint is seeking Senior Data / Kafka Engineers to support one of our clients on a significant data platform initiative. This is an opportunity to contribute to the development of modern, event-driven data systems, working with streaming technologies and cloud-native infrastructure. You will help design and build scalable data pipelines and services that enable reliable, real-time data processing across the organisation. The environment focuses on Python-based engineering, Kafka streaming platforms, and containerised microservices deployed within AWS. This is a contract role based in Auckland, with a hybrid working model (approximately 2–3 days per week on-site) and an initial contract through to December, with potential for extension.
What you’ll be doing

You will work within a cross-functional engineering team to build and evolve a modern data platform supporting event-driven architectures and real-time data pipelines. Your responsibilities will include:

- Designing and developing data pipelines and streaming services using Python
- Building and maintaining Kafka-based event streaming systems
- Developing containerised microservices deployed on Kubernetes (AWS EKS)
- Implementing reliable data ingestion, transformation, and processing pipelines
- Writing efficient SQL queries and working with large-scale datasets
- Contributing to data modelling and canonical data structures used across the platform
- Collaborating with engineers, architects, and platform teams to improve system scalability and reliability

The experience we’re looking for

We’re looking for experienced engineers who have built production-grade data platforms and streaming systems. Key skills include:

- Strong commercial experience developing backend or data services in Python
- Experience working with event streaming platforms such as Apache Kafka
- Proven ability building data pipelines and distributed data processing systems
- Strong SQL skills and experience working with relational databases
- Experience building and deploying services within containerised or Kubernetes-based environments
- Familiarity with AWS data infrastructure such as EKS, RDS, or similar services
- Understanding of data modelling, CDC (Change Data Capture), and data transformation patterns

Most importantly, we are looking for engineers who combine strong technical depth with a collaborative mindset, and who enjoy solving complex data engineering challenges.

What’s next?

Sound like the opportunity you’ve been looking for? Please apply and one of our Talent team will be in touch for a confidential chat. You must be eligible to work in New Zealand to be considered for this opportunity.
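For candidates unfamiliar with the CDC (Change Data Capture) pattern named in the skills list, here is a minimal illustrative sketch, not taken from the posting: the function names and event shape are assumptions, and a real system would consume these events from a Kafka topic rather than a Python list.

```python
# Illustrative CDC sketch (assumed names and event shape, not from the posting):
# replaying a stream of change events onto an in-memory keyed snapshot.

def apply_cdc_event(table: dict, event: dict) -> dict:
    """Apply one change event of the form {'op', 'key', 'row'} to a snapshot."""
    op = event["op"]
    if op in ("insert", "update"):
        # Inserts and updates both upsert the row under its key.
        table[event["key"]] = event["row"]
    elif op == "delete":
        # Deletes remove the key; missing keys are ignored.
        table.pop(event["key"], None)
    return table

# Replaying a small stream of change events builds the current state.
events = [
    {"op": "insert", "key": 1, "row": {"name": "Alice"}},
    {"op": "update", "key": 1, "row": {"name": "Alicia"}},
    {"op": "insert", "key": 2, "row": {"name": "Bob"}},
    {"op": "delete", "key": 1, "row": None},
]
snapshot = {}
for e in events:
    apply_cdc_event(snapshot, e)
# snapshot now holds only key 2, since key 1 was deleted after its update.
```

In production, tooling such as Debezium typically emits events like these from database transaction logs into Kafka topics, where consumers apply them downstream.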


Responsibilities
The engineer will work within a cross-functional team to build and evolve a modern data platform supporting event-driven architectures and real-time data pipelines. Responsibilities include designing and developing data pipelines and streaming services using Python and building Kafka-based event streaming systems.