Principal Data Engineer (AWS Cloud, Integration Exp) at Commonwealth Bank
Sydney, New South Wales, Australia - Full Time


Start Date

Immediate

Expiry Date

17 Sep, 2025

Salary

0.0

Posted On

17 Jun, 2025

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Security, Programming Languages, Automation Tools, Indexing, GraphQL, Architecture, Cloud, Writing, Technology, Adoption, Glue, Scripting, Kafka, SQL, Data Engineering, Availability, Communication Skills, Distributed Systems, Languages, Python, Java, Database Design

Industry

Information Technology/IT

Description

PRINCIPAL DATA ENGINEER (AWS CLOUD, INTEGRATION EXPERIENCE)

  • We’re embarking on an exciting Digital Transformation program and are ready to push the boundaries and deliver engineering best practices to elevate the digital experience of our customers.
  • You have knowledge and experience that spans both development and architecture, including data engineering, modelling and cloud architecture.
  • Together we will build tomorrow’s bank today, using world-leading engineering, technology, and innovation.

SEE YOURSELF IN OUR TEAM

This role is part of the Data Product Capability Crew within the Chief Data Analytics Office (CDAO), where we design and deliver cutting-edge technology solutions that serve data and analytics use cases.
We are seeking an outstanding Principal Engineer to join our team and help shape the direction of the data and analytics platforms we are building.

SKILLS REQUIRED:

We use a broad range of tools, languages, and frameworks. We don’t expect you to know them all, but experience with, or exposure to, some of these (or their equivalents) will set you up for success in this team:

  • Thorough understanding of large-scale distributed systems, solution design and architecture principles
  • Strong hands-on experience across all stages of the SDLC, and advocacy of a design-first programming approach to ensure adoption of engineering best practices
  • Ability to design solutions to meet the highest possible quality standards while simultaneously balancing security, performance, availability, and maintainability concerns
  • Sound understanding of Data Modelling concepts (e.g. conceptual/logical/physical modelling)
  • Proficiency in programming languages such as Python or Java for experimentation and executing quick proofs of concept.
  • Integration experience, with exposure to REST APIs, GraphQL, Node.js, Swagger and Kafka
  • Experience with AWS services such as Glue, EMR, S3, Redshift and serverless Lambda.
  • Strong knowledge of SQL for writing, optimizing, and debugging queries.
  • Familiarity with database design, indexing, and normalization principles.
  • Proficiency in automation tools and scripting (e.g., bash scripting, cron jobs)
  • Strong interpersonal and communication skills, with the ability to positively influence the engineering community across the group and sell ideas to senior technology stakeholders
  • Mentoring and coaching of other engineers.
Responsibilities

Please refer to the job description for details.
