Data Engineer (Sydney/Melbourne)

at The Data Foundry

Malvern East, Victoria, Australia

Start Date: Immediate
Expiry Date: 04 Jul, 2024
Salary: Not Specified
Posted On: 05 Apr, 2024
Experience: 1 year(s) or above
Skills: Operations, Data Classification, Avro, TypeScript, Data Security, Technical Requirements, JSON, Spark, Data Governance, Bitbucket, Data Analysis, SQL, Jenkins, Software Development Tools, Data Science, Scala, ORC, Data Transformation, Java, Code, Data Engineering, Databases
Telecommute: No
Sponsor Visa: No

Description:

Company Description
We are a customer-obsessed, “one stop shop” for all things data-related. We are deliberately narrow in our focus areas but we go deep in each of them. We provide solutions, services and training in the data strategy, data governance, data security, data management, data classification, data wrangling, data lake, data analytics, data visualisation, artificial intelligence, machine learning and IoT areas. Our solutions and services help transform customer organisations by teaching them how to turn their data into insight!
We build on AWS, using the latest AWS services, best practices and reference architectures. We deliberately focus on large public sector customers that are data-heavy but insight-light – federal & state government departments, universities, public sector agencies, public health organisations, and public utilities.
We are an AWS Advanced Consulting Partner and an AWS Public Sector Partner, working towards our Data & Analytics and Government competencies. We are also a registered technology supplier to the Federal, New South Wales, Victorian and Western Australian governments and we have a Master Services Agreement in place with AWS Professional Services.
Job Description
As an AWS Data Engineer, you will play a key role in working with customers to establish cloud infrastructure, build data pipelines and manage their data effectively. You will have considerable experience working on data-related solutions using AWS and will be comfortable working with technical resources including data analysts, data engineers and data scientists, to enable customers to generate insight from their data.
You will relish the opportunity to innovate, evolve and test existing assumptions, and design bold solutions. Your experience with large-scale data challenges and your ability to create reliable, scalable cloud-based solutions will see you thriving in this role!

Expectations

  • Enjoy honing your technical skills and developing new ones, so you can make a strong contribution to deep architecture discussions.
  • Regularly take part in further education and training to help you develop high-quality and highly performant solutions.
  • Have a demonstrated ability to think strategically about solutions to business, product, and technical challenges.
  • Possess an innate awareness of technology trends and how they impact the way businesses consume IT.
  • Have a reasonable understanding of IT security and governance including security standards, access policies, and data classification schema.
  • Love what you do and instinctively know how to make work fun.
  • Be dynamic, creative, and willing to take on challenges that have the potential to make a big impact.
  • Relish a ground floor opportunity to work in a small, agile, startup environment and play a role in shaping its business and technical goals.
  • A technical degree is required (examples include Computer Science, Data Science, Information Technology or similar).

Qualifications

Technical Requirements:

  • 2+ years of:
      • AWS hands-on technical experience
      • Software development tools and methodologies
      • Data engineering-related experience
  • Experience producing high-quality documentation and written communications
  • 1+ years of experience working with DevOps tools such as Bitbucket, Bamboo, Jenkins or Ansible
  • 1+ years of experience working with CI/CD tools integration and operations.
  • 2+ AWS certifications, including any of: Developer Associate, Solutions Architect Associate, SysOps Administrator Associate, Solutions Architect Professional, Data Analytics Specialty and/or Machine Learning Specialty.
  • Strong understanding of Infrastructure as Code, especially AWS CloudFormation or CDK.
  • Programming skills in one or more of the following: Python, Java, TypeScript, Scala and Spark.
  • Understanding of SQL, ER diagrams, and data dictionaries to understand and enable the transformation and curation of data.
  • Experience in building real-time or batch ingestion and transformation pipelines.
  • Familiarity with data storage format types such as JSON, AVRO, Parquet, ORC.
  • An understanding of fundamental data concepts including databases, data schema, data classification, data security, data governance, data analysis, data transformation, data engineering and data science.
  • Experience and understanding of container technology, especially Docker (Desirable)
  • An understanding of fundamental networking concepts (Desirable)

Responsibilities:

Please refer to the job description above for details.


REQUIREMENT SUMMARY

Min: 1.0 year(s) / Max: 2.0 year(s)

Information Technology/IT

IT Software - DBA / Data Warehousing

Software Engineering

Graduate

Proficient

1

Malvern East VIC 3145, Australia