Data Engineer at GFL Environmental
Vaughan, ON, Canada
Full Time


Start Date

Immediate

Expiry Date

06 Dec, 25

Salary

0.0

Posted On

07 Sep, 25

Experience

1 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Apache Spark, SQL, Code, GitHub, Data Processing, Technical Requirements, Kafka, Infrastructure, Python

Industry

Information Technology/IT

Description

TECHNICAL REQUIREMENTS:

  • 1-3 years of experience as an intermediate engineer, with proficiency in AWS services.
  • Strong programming skills in Python and SQL.
  • Strong experience with Apache Spark and Delta Lake for big data processing.
  • Expertise in using Terraform for Infrastructure as Code (IaC).
  • Proficiency with standard DevOps tools such as GitHub, Azure DevOps, etc.
  • Experience with Kafka and real-time data streaming pipelines, along with geospatial data processing and analysis, is a nice-to-have.
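At its core, the Spark and SQL work described above is transform logic over tabular data. A minimal framework-free sketch of that shape, using plain Python as a stand-in for a Spark DataFrame job (the column names and the waste-pickup records are illustrative assumptions, not from the posting; a real job would read rows from a Delta Lake table via `spark.read.format("delta")`):

```python
from collections import defaultdict

# Illustrative pickup records; in a real Spark job these would be rows
# loaded from a Delta Lake table rather than an in-memory list.
records = [
    {"route": "A", "tonnes": 2.5, "status": "completed"},
    {"route": "A", "tonnes": 1.0, "status": "missed"},
    {"route": "B", "tonnes": 3.2, "status": "completed"},
]

def total_completed_tonnage(rows):
    """Sum tonnage of completed pickups per route -- the plain-Python
    equivalent of a filter + groupBy + sum in Spark."""
    totals = defaultdict(float)
    for row in rows:
        if row["status"] == "completed":
            totals[row["route"]] += row["tonnes"]
    return dict(totals)

print(total_completed_tonnage(records))  # → {'A': 2.5, 'B': 3.2}
```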

RESPONSIBILITIES:
  • Design, develop, and maintain scalable, real-time and batch data pipelines using AWS Glue, Lambda, Apache Spark, and Kafka.
  • Implement data lake architecture to ensure efficient data storage, processing, and retrieval.
  • Collaborate with cross-functional teams to understand data requirements and ensure the data infrastructure supports their needs.
  • Use Terraform to manage and provision infrastructure in a reproducible and scalable manner.
  • Optimize and troubleshoot complex data pipelines, ensuring high availability and performance.
  • Explore and integrate the latest technologies to enhance our data processing capabilities.
  • Work in a fast-paced, startup-like environment where you will take ownership of key projects and contribute to our overall data strategy.
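The real-time pipeline responsibilities above boil down to a consume-transform-emit loop over a message stream. A minimal sketch, assuming a stdlib queue as a stand-in for a Kafka topic (the GPS-ping message shape and the grid-cell transform are hypothetical; a real pipeline would use a Kafka consumer client instead):

```python
import json
import queue

# Stand-in for a Kafka topic holding JSON-encoded messages.
topic = queue.Queue()
for payload in ({"truck_id": 1, "lat": 43.8, "lon": -79.5},
                {"truck_id": 2, "lat": 43.9, "lon": -79.6}):
    topic.put(json.dumps(payload))

def consume_batch(q, max_messages=10):
    """Drain up to max_messages, decode, and transform each one --
    the shape of a single micro-batch in a streaming pipeline."""
    out = []
    while not q.empty() and len(out) < max_messages:
        msg = json.loads(q.get())
        # Example geospatial transform: tag each ping with a coarse grid cell.
        msg["cell"] = (round(msg["lat"], 1), round(msg["lon"], 1))
        out.append(msg)
    return out

batch = consume_batch(topic)
print([m["truck_id"] for m in batch])  # → [1, 2]
```

The `max_messages` cap bounds each batch, which is the same back-pressure idea that keeps real streaming consumers from falling arbitrarily behind the topic.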