Senior Software Engineer - Data Engineering (SaaS) at CreditorWatch
Sydney NSW 2000, New South Wales, Australia
Full Time


Start Date

Immediate

Expiry Date

13 May 2025

Salary

0.0

Posted On

13 Feb 2025

Experience

0 year(s) or above

Remote Job

No

Telecommute

No

Sponsor Visa

No

Skills

Data Services, Programming Languages, SQL, Data Warehouse, Data Systems, Python, Infrastructure

Industry

Information Technology/IT

Description

SKILLS:

  • Deep expertise in SQL, Python and other object-oriented programming languages.
  • CI/CD experience, infrastructure-as-code tools (Terraform), and data warehouse and transformation tools (dbt, Databricks)
  • Experience building and optimising data pipelines
  • Ability to design and implement efficient APIs for data delivery
  • Experience with monitoring and observability tools for data systems
  • Proven track record of building production-grade data services
  • Advanced understanding of data modelling methodologies (e.g., Kimball dimensional modelling)
  • Excellent problem-solving and debugging skills
Responsibilities

OUR PURPOSE

✅ Empower Australian businesses to trade confidently with their customers.

YOUR ROLE & TEAM

We are excited to launch a newly created role as a Senior Software Engineer, focusing on Data Engineering.
You will design and implement high-performance data products and services that power customer-facing applications.
As a Senior Software Engineer in the Data Engineering space, you will architect and build end-to-end solutions spanning data pipelines, storage layers, and APIs, with a focus on scalability and real-time performance.
As this is a newly created role, you will have the opportunity to make it your own and shape it.
This role reports directly to the Staff Data Engineer and is a full-time opportunity offering hybrid working conditions based out of our Sydney CBD office.

YOUR RESPONSIBILITIES INCLUDE, BUT ARE NOT LIMITED TO:

  • Design and implement scalable data architectures that integrate seamlessly with product applications, focusing on high-performance API delivery and optimised storage solutions.
  • Architect and build robust SQL and Python data pipelines that handle both batch and real-time processing requirements.
  • Design and implement efficient data models and caching strategies to support low-latency data access patterns.
  • Develop and maintain APIs that serve data products, ensuring high availability and performance.
  • Build data quality frameworks and implement automated validation processes.
  • Write comprehensive technical design documents and contribute to architectural decisions.
  • Implement security and governance controls across the data platform.
  • Evaluate and introduce new technologies that can improve system performance or developer productivity.