Data Engineering Specialist at Endeavour Foundation
CHQ4, Australia
Full Time


Start Date

Immediate

Expiry Date

27 Nov 2025

Salary

15,900

Posted On

27 Aug 2025

Experience

5+ years

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Design, Large Enterprise, Version Control, Data Integration, Business Intelligence, Data Engineering, Master Data Management, Data Vault, Data Solutions, Snowflake, Access Control, Kimball, Privacy Regulations, Inmon, SQL, Data Warehouse, Data Architecture, Encryption

Industry

Information Technology/IT

Description

Endeavour Foundation is one of Australia’s largest employers of people living with disability. Our people live and breathe our purpose every day, and we offer flexible working, great perks, and work/life balance. We are dedicated to creating an inclusive culture where our employees can grow, learn, and do their best work. Our purpose is simple: make possibility a reality.

MAKE AN IMPACT

We are seeking a highly capable and experienced Data Engineering Specialist to join our growing Data Analytics and Intelligence team. This role is ideal for a professional who thrives in a collaborative environment, brings deep technical expertise, and is passionate about delivering scalable, high-quality data solutions that drive business outcomes.
You will lead the design and development of robust data pipelines, data lakes, and data warehouse solutions, ensuring data quality, governance, and privacy are embedded throughout. You’ll work closely with analysts, architects, and business stakeholders to deliver trusted, high-performance data platforms.

ESSENTIAL TECHNICAL SKILLS & EXPERIENCE

  • 5+ years of experience in data engineering, data integration, or similar roles.
  • 5+ years of proven experience implementing data warehouses, data lakes, or similar data centralisation models with Snowflake, Informatica IDMC, SQL, Python, and Microsoft Power BI in a large enterprise. Airflow and dbt implementation skills are desirable but not essential.
  • Data architecture: Experience designing and implementing data architectures and modelling approaches such as Inmon, Kimball, and Data Vault.
  • Python programming: Ability to write clean, efficient Python code for data manipulation and analysis.
  • End-to-end data project delivery: Demonstrated success delivering comprehensive data solutions spanning data governance, architecture and modelling, ETL/ELT processes, data lakes and warehousing, Master Data Management (MDM), and Business Intelligence (BI).
  • Engineering delivery practices: Demonstrated experience with Agile and DevOps methodologies, including Azure DevOps version control, APIs, containers and microservices, data pipeline orchestration, and documentation tools such as wikis or Confluence.
  • Data observability: Monitor data freshness, completeness, and schema changes using modern tools.
  • Data governance: Apply frameworks like DAMA-DMBOK and ensure compliance with privacy regulations.
  • Security and access control: Implement encryption, masking, and role-based permissions for sensitive data.
  • Privacy-aware engineering: Design systems that support anonymisation, pseudonymisation, and data minimisation.

PREFERRED EDUCATION OR CERTIFICATIONS (1 OR MORE)

  • Graduate degree in Data Engineering or a similar field
  • SnowPro Advanced: Data Engineer or SnowPro Advanced: Administrator
  • Cloud Data Integration Developer – Professional Certification
  • PowerCenter Data Integration Developer – Professional Certification
  • PowerCenter Cloud Edition Developer – Professional Certification
  • PowerCenter Data Integration Administrator – Professional Certification
  • PowerCenter Cloud Edition Administrator – Professional Certification
  • Data Engineering Developer – Professional Certification
  • Microsoft Certified: Power BI Data Analyst Associate
  • Microsoft Certified: Azure Enterprise Data Analyst Associate

Responsibilities
  • Develop strong working relationships with business and technical stakeholders across projects.
  • Deliver performant end-to-end data engineering solutions in a collaborative team environment.
  • Escalate issues appropriately and propose practical solutions to ensure delivery quality.
  • Contribute to the growth of the consulting practice through knowledge-sharing and data service initiatives.
  • Design and implement scalable ETL/ELT workflows with built-in validation and error handling.
  • Build and maintain data pipelines using Snowflake, Informatica IDMC, Python, and other modern data integration tools.
  • Integrate central data stores, data warehouses, and data lakes with analytics platforms like Power BI.
  • Deliver comprehensive data solutions involving governance, architecture, modelling, MDM, BI, and warehousing.
  • Apply Agile and DevOps methodologies including Git, APIs, containers, microservices, and orchestration.
  • Monitor data freshness, completeness, and schema changes using modern observability tools.
  • Implement data governance frameworks (e.g. DAMA-DMBOK) and ensure compliance with privacy regulations.
  • Design systems that support anonymisation, pseudonymisation, and data minimisation.
  • Apply security best practices including encryption, masking, and role-based access control.