Senior Data Operations Engineer at Bupa
Melbourne, Victoria, Australia
Full Time


Start Date

Immediate

Expiry Date

29 Jul, 25

Salary

0.0

Posted On

29 Apr, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Orchestration, Scripting Languages, Python, Bash, Containerization, Azure, Powershell, Devops, Data Engineering, Distributed Systems, Docker, Software Development, Computer Science

Industry

Information Technology/IT

Description

OPPORTUNITY SNAPSHOT:

An exciting 12-month fixed-term opportunity has become available for a Senior Data Operations Engineer.
Our Data Operations Team is a team with specialist skills that enables Bupa ANZ to execute on its Data Strategy. Collectively, we are working to build the future state of Bupa’s data platforms on Microsoft Azure services, and to enable the business to work strategically with data to maximise competitive performance. Data Operations contributes to the evolution of Bupa’s data estate via squad-based and product-based teams and business-partnering approaches.
We are growing our DevOps and DataOps capabilities at Bupa, maturing our practices toward industry-leading approaches so that we proactively solve problems before they impact customers.
Our DataOps people bring a broad skill set to continue our consolidation and migration journey to our Strategic Data Platform. Skills in Operational Data and Analytics, Data Warehouse and BI, and Advanced Analytics and Self-service, across business units including Health Insurance, Health Services, and corporate functions, are as important as cloud-based skills in Databricks and Azure.
The Senior Data Operations Engineer is embedded in and closely coupled with product-based teams to help design and deliver outcomes aligned to business and data platform squads. They also play an important role in defining better practices, standards, and guidelines, and are a key contributor to the team’s continual-improvement initiatives.

ABOUT US:

Bupa has a strategic goal of being the most customer-centric digital healthcare organisation, with the use of data as an explicit pillar of this strategy.

Requirements
  • Educated to minimum of degree level in engineering, computer science or related technology discipline.
  • 7+ years’ experience in DevOps and/or Data Engineering and/or Software Development
  • Demonstrated experience leading code reviews and improving code quality across a team.
  • Strong background in cloud platforms (AWS, Azure, GCP) with a focus on building scalable, resilient infrastructure for both applications and data workflows.
  • Deep understanding of data engineering principles, including building and managing ETL/ELT pipelines, data lake solutions (Databricks), and streaming data technologies (e.g., Kafka, Spark).
  • Hands-on experience with containerization (Docker) and orchestration (Kubernetes) for managing large-scale, distributed systems.
  • Proficiency in programming and scripting languages such as Python, PowerShell or Bash for automating infrastructure and data processing tasks.
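As a purely illustrative sketch (not part of the posting), the kind of small Python automation implied by the last point — a data-quality check of the sort a DataOps engineer might drop into a pipeline — could look like this. All names here are hypothetical:

```python
# Illustrative only: a minimal record-validation step of the kind
# a DataOps engineer might automate in Python. Rules and field
# names ("id", "amount") are hypothetical.

def validate_records(records):
    """Split records into (valid, invalid) lists using simple rules."""
    valid, invalid = [], []
    for rec in records:
        # A record is valid if it has a non-empty "id" and a numeric "amount".
        if rec.get("id") and isinstance(rec.get("amount"), (int, float)):
            valid.append(rec)
        else:
            invalid.append(rec)
    return valid, invalid

if __name__ == "__main__":
    sample = [
        {"id": "a1", "amount": 10.5},
        {"id": "", "amount": 3},          # missing id -> invalid
        {"id": "b2", "amount": "oops"},   # non-numeric amount -> invalid
    ]
    ok, bad = validate_records(sample)
    print(len(ok), len(bad))  # → 1 2
```

In a real pipeline a step like this would typically emit metrics or quarantine the invalid records rather than just count them.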