Data Engineer at GFL Environmental
Vaughan, ON, Canada
Full Time


Start Date

Immediate

Expiry Date

16 Nov, 25

Salary

0.0

Posted On

16 Aug, 25

Experience

1 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Kafka, GitHub, Infrastructure, SQL, Code, Python, Technical Requirements, Apache Spark, Data Processing

Industry

Information Technology/IT

Description

We are seeking a highly skilled Data Engineer with experience designing and maintaining real-time data streaming pipelines and building robust data lake infrastructure. The right candidate will be excited by the prospect of building and optimizing data architecture to support our next generation of products and data initiatives.

TECHNICAL REQUIREMENTS:

  • 1-3 years of experience as an intermediate engineer with proficiency in AWS services.
  • Strong programming skills in Python and SQL.
  • Strong experience with Apache Spark and Delta Lake for big data processing.
  • Expertise in using Terraform for Infrastructure as Code (IaC).
  • Proficiency with standard DevOps tools such as GitHub, Azure DevOps, etc.
  • Experience with Kafka and real-time data streaming pipelines, along with geospatial data processing and analysis, is a plus.
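As an illustration of the geospatial processing the last point alludes to, here is a minimal pure-Python sketch of a great-circle distance calculation (the coordinates and the stdlib-only approach are illustrative; a production pipeline would typically lean on a geospatial library or Spark):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine formula: a is the squared half-chord length between the points
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Roughly Toronto city hall -> Vaughan (illustrative coordinates)
distance = haversine_km(43.6532, -79.3832, 43.8361, -79.4983)
print(f"{distance:.1f} km")
```

The same per-record computation would usually be expressed as a vectorized or UDF-based transformation when run over large datasets in Spark.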
RESPONSIBILITIES:
  • Design, develop, and maintain scalable, real-time and batch data pipelines using AWS Glue, Lambda, Apache Spark, and Kafka.
  • Implement data lake architecture to ensure efficient data storage, processing, and retrieval.
  • Collaborate with cross-functional teams to understand data requirements and ensure the data infrastructure supports their needs.
  • Use Terraform to manage and provision infrastructure in a reproducible and scalable manner.
  • Optimize and troubleshoot complex data pipelines, ensuring high availability and performance.
  • Explore and integrate the latest technologies to enhance our data processing capabilities.
  • Work in a fast-paced, startup-like environment where you will take ownership of key projects and contribute to our overall data strategy.
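The pipeline work described above can be sketched in miniature. The toy example below runs a tumbling-window aggregation in pure Python over hypothetical route pickup events (the route IDs and event shape are invented for illustration; at scale, this is the kind of aggregation a Spark Structured Streaming job would compute continuously over a Kafka topic):

```python
from collections import defaultdict
from datetime import datetime, timedelta

def tumbling_window_counts(events, window_minutes=5):
    """Count events per (window_start, route) over fixed tumbling windows.

    events: iterable of (timestamp, route_id) pairs.
    Returns a dict mapping (window_start, route_id) -> count.
    """
    counts = defaultdict(int)
    size = timedelta(minutes=window_minutes)
    for ts, route in events:
        # Snap each timestamp down to the start of its tumbling window
        window_start = datetime.min + ((ts - datetime.min) // size) * size
        counts[(window_start, route)] += 1
    return dict(counts)

# Hypothetical pickup events on two routes
events = [
    (datetime(2025, 8, 16, 9, 1), "R1"),
    (datetime(2025, 8, 16, 9, 3), "R1"),
    (datetime(2025, 8, 16, 9, 7), "R2"),
]
print(tumbling_window_counts(events))
```

In a real deployment the windowing, state management, and late-data handling would be delegated to the streaming engine rather than hand-rolled.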