Senior Data Engineer (AWS, Big Data) at Commonwealth Bank
Sydney, New South Wales, Australia
Full Time


Start Date

Immediate

Expiry Date

18 Nov 2025

Salary

0.0

Posted On

19 Aug 2025

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

MapReduce, Data Vault, Data Warehousing, Oracle, Kimball, RDBMS, Data Integration, Data Models, Kafka, Shell Scripting, Sqoop, Glue, Scala, Spark, Python, Design Patterns, Stakeholder Management, Teradata, Requirements Gathering, Java

Industry

Information Technology/IT

Description

SENIOR DATA ENGINEER (AWS, BIG DATA)

  • You are determined to stay ahead of the latest Cloud, Big Data and Data warehouse technologies.
  • We’re one of the largest and most advanced Data Engineering teams in the country.
  • Together we can build state-of-the-art data solutions that power seamless experiences for millions of customers.

TECHNICAL SKILLS

We use a broad range of tools, languages, and frameworks. We don’t expect you to know them all, but experience with, or exposure to, some of these (or their equivalents) will set you up for success in this team.

  • Experience designing, building, and delivering enterprise-wide data ingestion, data integration, and data pipeline solutions using a common programming language (Scala, Java, or Python) on a Big Data and Data Warehouse platform.
  • Experience building data solutions on the Hadoop platform, using Spark, MapReduce, Sqoop, Kafka, and various ETL frameworks for distributed data storage and processing.
  • Experience building data solutions using AWS Cloud technologies (EMR, Glue, Iceberg, Kinesis, MSK/Kafka, Redshift/PostgreSQL, DocumentDB/MongoDB, S3, etc.).
  • Ability to produce conceptual, logical, and physical data models using data modelling techniques such as Data Vault, Kimball, and 3NF, and demonstrated expertise in design patterns (FSLDM, IBM IFW DW).
  • Strong Unix/Linux Shell scripting and programming skills in Scala, Java, or Python.
  • Proficient in SQL scripting, writing complex SQLs for building data pipelines.
  • Familiarity with data warehousing and/or data mart builds in Teradata, Oracle, or another RDBMS is a plus.
  • Experience in Ab Initio software products (GDE, Co>Operating System, Express>It, etc.) is a plus.
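To illustrate the kind of SQL-driven pipeline work described above (staging raw data, then building an aggregated mart-style table), here is a minimal sketch. All table and column names are hypothetical, and SQLite stands in for a warehouse engine such as Teradata or Redshift:

```python
# Minimal sketch of a SQL pipeline step: load staging data, then
# build a per-customer daily aggregate (a simple data-mart table).
# Table/column names are illustrative only; SQLite is used here as
# a self-contained stand-in for a real warehouse engine.
import sqlite3

conn = sqlite3.connect(":memory:")

# Staging layer: raw transactions as landed from a source system.
conn.executescript("""
    CREATE TABLE stg_transactions (
        txn_id      INTEGER,
        customer_id INTEGER,
        amount      REAL,
        txn_date    TEXT
    );
    INSERT INTO stg_transactions VALUES
        (1, 100, 50.0, '2025-01-01'),
        (2, 100, 25.0, '2025-01-02'),
        (3, 200, 10.0, '2025-01-01');
""")

# Mart layer: aggregate staging data into a reporting-friendly shape.
conn.executescript("""
    CREATE TABLE mart_customer_daily AS
    SELECT customer_id,
           txn_date,
           SUM(amount) AS total_amount,
           COUNT(*)    AS txn_count
    FROM stg_transactions
    GROUP BY customer_id, txn_date;
""")

rows = conn.execute(
    "SELECT customer_id, txn_date, total_amount "
    "FROM mart_customer_daily "
    "ORDER BY customer_id, txn_date"
).fetchall()
print(rows)
```

In a production pipeline the same pattern would typically run via Spark SQL or a warehouse scheduler rather than SQLite, but the staging-then-aggregate structure is the same.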

Nice-to-Have:

  • Prior experience as a Tech BA.
  • Skills in requirements gathering, data discovery, and stakeholder management.

Responsibilities

Please refer to the job description for details.
