Senior Data Engineer
at Tridant
Melbourne, Victoria, Australia
Start Date: Immediate
Expiry Date: 23 Apr, 2025
Salary: Not Specified
Posted On: 23 Jan, 2025
Experience: 5 year(s) or above
Skills: Git, Apache Spark, Infrastructure, Version Control, Python, Code, Data Governance, Computer Science, Data Processing, Power BI, SQL, Agile, Soft Skills, Scrum, Security, Microsoft Azure
Telecommute: No
Sponsor Visa: No
Description:
REQUIRED QUALIFICATIONS & EXPERIENCE
- Bachelor’s degree in Computer Science, Engineering, or related field
- 5+ years of experience in data engineering roles
- Strong expertise in Microsoft Fabric, including Data Factory, Synapse Analytics, and Power BI
- Advanced Python programming skills with experience in data processing libraries (pandas, NumPy, PySpark); see the sketch after this list
- Proven experience with Azure Synapse Analytics, including SQL pools and Spark pools
- Strong understanding of data warehousing concepts and dimensional modelling
- Experience working in Agile environments with Scrum methodology
- Excellent understanding of CI/CD pipelines and DevOps practices
- Strong SQL skills and experience with large-scale data processing
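For illustration only: a minimal Python sketch of the kind of pandas/PySpark-style data processing the requirement above refers to. The dataset, column names, and aggregation are assumptions made up for this example, not part of the role.

```python
import pandas as pd

def summarise_daily_sales(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean raw sales records and aggregate them to one row per store per day."""
    cleaned = (
        raw.dropna(subset=["store_id", "amount"])  # drop rows missing key fields
           .assign(sale_date=lambda df: pd.to_datetime(df["sale_date"]))
    )
    return (
        cleaned.groupby(["store_id", cleaned["sale_date"].dt.date])["amount"]
               .sum()
               .reset_index(name="daily_total")
    )
```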
PREFERRED QUALIFICATIONS & EXPERIENCE
- Experience with Databricks and Delta Lake architecture (see the sketch after this list)
- Knowledge of data governance and security best practices
- Microsoft Fabric Data Engineer certification and/or Azure Data Engineer Associate certification
- Experience with real-time data processing and streaming technologies
- Familiarity with Infrastructure as Code (IaC) tools
- Experience with version control systems (Git) and collaborative development
- Experience integrating Generative AI and Large Language Models (LLMs)
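For illustration only: a minimal sketch of working with Delta Lake from PySpark, as mentioned in the Databricks/Delta Lake item above. It assumes a Spark session already configured with the Delta Lake (delta-spark) extensions; the table path and sample data are made up for this example.

```python
from pyspark.sql import SparkSession

# Assumes the session was built with the Delta Lake extensions enabled;
# configuration details are omitted here.
spark = SparkSession.builder.appName("delta-sketch").getOrCreate()

# Write a small DataFrame as a Delta table, then read it back.
events = spark.createDataFrame(
    [(1, "login"), (2, "purchase")],
    ["user_id", "event_type"],
)
events.write.format("delta").mode("overwrite").save("/tmp/events_delta")

history = spark.read.format("delta").load("/tmp/events_delta")
history.show()
```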
TECHNICAL SKILLS
- Cloud Platforms: Microsoft Azure, Microsoft Fabric
- Data Processing: Azure Synapse Analytics, Python, SQL
- ETL/ELT Tools: Azure Data Factory, Synapse Pipelines
- Version Control: Git
- Methodologies: Agile, Scrum
- Optional: Databricks, Apache Spark, Generative AI
SOFT SKILLS
- Strong problem-solving and analytical thinking abilities
- Excellent communication and collaboration skills
- Ability to work independently and as part of a team
- Strong project management and organizational skills
- Adaptability and willingness to learn new technologies
Responsibilities:
THE ROLE
We are seeking an experienced Data Engineer or Senior Data Engineer to design, implement, and maintain data solutions using Microsoft Fabric and related technologies. The ideal candidate will bridge the gap between business needs and technical solutions, working within an Agile framework to deliver scalable data platforms.
KEY RESPONSIBILITIES
- Lead the design and implementation of end-to-end data pipelines using Microsoft Fabric, ensuring optimal performance, scalability, and reliability of data infrastructure
- Develop and maintain data transformation processes using Python, implementing best practices for code quality and documentation
- Architect and implement solutions using Microsoft Fabric, creating efficient data warehousing, lakehouse, and analytics solutions
- Collaborate with cross-functional teams using Agile/Scrum methodologies, participating in sprint planning, daily stand-ups, and retrospectives
- Design and implement data quality frameworks and monitoring solutions to ensure data accuracy and reliability (see the sketch after this list)
- Mentor junior engineers and contribute to technical decision-making processes
- Optimize existing data pipelines and implement automation solutions for improved efficiency
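For illustration only: a minimal sketch of the data quality checks referenced in the responsibilities above, assuming a pandas DataFrame as the unit of validation. The rule names and columns (order_id, amount) are assumptions for this example, not part of the role.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict[str, bool]:
    """Evaluate a few basic data-quality rules and return pass/fail per rule."""
    return {
        "no_duplicate_keys": not df["order_id"].duplicated().any(),
        "no_null_amounts": bool(df["amount"].notna().all()),
        "amounts_non_negative": bool((df["amount"].dropna() >= 0).all()),
    }

def assert_quality(df: pd.DataFrame) -> None:
    """Raise early so a pipeline step fails fast when a rule is violated."""
    failed = [rule for rule, ok in run_quality_checks(df).items() if not ok]
    if failed:
        raise ValueError(f"Data quality checks failed: {failed}")
```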
REQUIREMENT SUMMARY
- Experience: Min 5.0, Max 10.0 year(s)
- Industry: Information Technology/IT
- Category: IT Software - DBA / Datawarehousing
- Functional Area: Software Engineering
- Education: Graduate, Computer Science/Engineering or a related field
- Proficiency: Proficient
- Vacancies: 1
- Location: Melbourne VIC, Australia