Principal Data Platform Engineer at Simple Machines
Sydney, New South Wales, Australia
Full Time


Start Date

Immediate

Expiry Date

05 Apr, 26

Salary

Not specified

Posted On

05 Jan, 26

Experience

5 years or more

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, SQL, Spark, Databricks, Snowflake, AWS, GCP, Data Mesh, Data Products, Data Contracts, Kafka, Flink, Airflow, Terraform, CI/CD, Data Testing, Consulting

Industry

Information Technology & Services

Description
Principal Data Platform Engineer

Who We Are

Simple Machines is a global, independent technology consultancy operating across Sydney, New Zealand, London, Poland and San Francisco. We design and build modern data platforms, intelligent systems, and bespoke software at the intersection of Data Engineering, Software Engineering and AI. We work with enterprises, scale-ups, and government to turn messy, high-value data into products, platforms, and decisions that actually move the needle. We don't do generic. We build things that matter. We engineer data to life™.

The Role

This is a hands-on principal engineering role, not an architecture-only seat and not a support function. You'll be responsible for technical direction, platform design and architectural decision-making. You'll design and build greenfield data platforms, real-time pipelines, and data products for clients who are serious about using data properly. You'll work in small, high-calibre teams and operate close to both the problem and the client. If you enjoy solving hard data problems, shaping modern architectures (data mesh, data products, contracts), and delivering real outcomes, this is your lane.

What You'll Be Doing

Lead Platform & Architecture Design
- Own the end-to-end architecture of modern, cloud-native data platforms
- Design scalable data ecosystems using data mesh, data products, and data contracts
- Make high-impact architectural decisions across ingestion, storage, processing, and access layers
- Ensure platforms are secure, compliant, and production-grade by design

Build Modern Data Platforms
- Design and deliver cloud-native data platforms using Databricks, Snowflake, AWS, and GCP
- Apply modern architectural patterns: data mesh, data products, and data contracts
- Integrate deeply with client systems to enable scalable, consumer-oriented data access

Develop High-Performance Pipelines
- Build and optimise batch and real-time pipelines (see the sketch after this section)
- Work with streaming and event-driven tech such as Kafka, Flink, Kinesis, Pub/Sub
- Orchestrate workflows using Airflow, Dataflow, Glue

Work at Scale
- Process and transform large datasets using Spark and Flink
- Design systems that perform in production, not just on paper

Own Data Storage & Performance
- Work across relational, NoSQL, and analytical stores (Postgres, BigQuery, Snowflake, Cassandra, MongoDB)
- Optimise storage formats and access patterns (Parquet, Delta, ORC, Avro)

Cloud, Security & Governance
- Implement secure, compliant data solutions with security by design
- Embed governance without killing developer velocity

Consult and Influence
- Work directly with clients to understand problems and shape solutions
- Translate business needs into pragmatic engineering decisions
- Act as a trusted technical advisor, not just an order taker

Technical Leadership & Quality
- Set engineering standards, patterns, and best practices across teams
- Review designs and code, providing clear technical direction and mentorship
- Raise the bar on data quality, testing, observability, and operational excellence
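To make the streaming work concrete, here is a minimal sketch of the kind of real-time pipeline this role covers: a PySpark Structured Streaming job that reads events from Kafka, parses them against an explicit schema (a lightweight stand-in for a data contract), and appends them to a Delta table. This is an illustration under stated assumptions, not a Simple Machines implementation; the topic, broker address, and paths are hypothetical placeholders.

```python
# Minimal sketch: Kafka -> Spark Structured Streaming -> Delta.
# Assumes the spark-sql-kafka and delta-spark packages are on the classpath;
# the topic name, broker address, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# An explicit schema acts as a lightweight data contract: events that do
# not parse come back as nulls and can be filtered or quarantined.
event_schema = StructType([
    StructField("order_id", StringType(), nullable=False),
    StructField("amount", DoubleType(), nullable=True),
    StructField("event_time", TimestampType(), nullable=True),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")                     # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers raw bytes; decode and parse the JSON payload against the contract.
events = (
    raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
    .filter(F.col("order_id").isNotNull())  # drop records that break the contract
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")  # hypothetical path
    .outputMode("append")
    .start("/tmp/tables/orders")                              # hypothetical path
)
query.awaitTermination()
```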
What We're Looking For

Core Engineering Strength
- Strong Python and SQL
- Deep experience with Spark and modern data platforms (Databricks / Snowflake)
- Solid grasp of cloud data services (AWS or GCP)

Architecture & Design Judgement
- Demonstrated ownership of large-scale data platform architectures
- Strong data modelling skills and architectural decision-making ability
- Comfortable balancing trade-offs between performance, cost, and complexity

Data Platform Experience
- Built and operated large-scale data pipelines in production
- Comfortable with multiple storage technologies and formats

Engineering Discipline
- Infrastructure-as-code experience (Terraform, Pulumi)
- CI/CD pipelines using tools like GitHub Actions, ArgoCD
- Data testing and quality frameworks (dbt, Great Expectations, Soda); a sketch of this kind of check appears at the end of this description

Delivery & Consulting Mindset
- Experience in consulting or professional services environments
- Strong consulting instincts: able to challenge assumptions and guide clients toward better outcomes
- Comfortable mentoring senior engineers and influencing technical culture

Why Simple Machines
- You'll work on interesting, high-impact problems
- You'll build modern platforms, not maintain legacy mess
- You'll be surrounded by senior engineers who actually know their craft
- You'll have autonomy, influence, and room to grow

If you're a senior data engineer who wants to build properly, think clearly, and deliver real outcomes, we should talk.
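As a flavour of the data testing expected here, below is a minimal sketch of a pipeline quality gate written directly in PySpark. Frameworks like Great Expectations, Soda, or dbt tests formalise this pattern; the table path, column names, and checks are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of a data quality gate run after a pipeline stage.
# Frameworks such as Great Expectations, Soda, or dbt tests formalise
# this pattern; columns and paths here are hypothetical placeholders.
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F

def check_orders_quality(df: DataFrame) -> list[str]:
    """Return a list of human-readable failures; an empty list means the gate passes."""
    failures = []
    total = df.count()

    if total == 0:
        return ["table is empty"]

    # Completeness: order_id must never be null.
    null_ids = df.filter(F.col("order_id").isNull()).count()
    if null_ids > 0:
        failures.append(f"{null_ids} rows with null order_id")

    # Uniqueness: order_id must be a key.
    distinct_ids = df.select("order_id").distinct().count()
    if distinct_ids != total:
        failures.append(f"{total - distinct_ids} duplicate order_id values")

    # Validity: amounts should be non-negative.
    bad_amounts = df.filter(F.col("amount") < 0).count()
    if bad_amounts > 0:
        failures.append(f"{bad_amounts} rows with negative amount")

    return failures

if __name__ == "__main__":
    spark = SparkSession.builder.appName("orders-quality-gate").getOrCreate()
    # Assumes the delta-spark package is available; the path is hypothetical.
    orders = spark.read.format("delta").load("/tmp/tables/orders")
    problems = check_orders_quality(orders)
    if problems:
        # Failing loudly lets CI/CD or the orchestrator block downstream consumers.
        raise SystemExit("Quality gate failed: " + "; ".join(problems))
```

Wiring a gate like this into the orchestrator (for example as an Airflow task between the load and publish steps) is what keeps bad data from reaching consumers without slowing developers down.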
Responsibilities
Lead the design and architecture of modern, cloud-native data platforms, ensuring they are secure and compliant by design. Build and optimise high-performance batch and real-time data pipelines, and work closely with clients to deliver impactful data solutions.