Data Engineer - Airflow, PySpark, Databricks, Python, Kafka at AstraNorth
Toronto, ON, Canada
Full Time


Start Date

Immediate

Expiry Date

27 Nov, 25

Salary

0.0

Posted On

28 Aug, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Design, Specifications, Coaching, Technical Direction, Data Engineering, Research, Reliability, Software Systems, Scrum, Containerization, Adherence, Optimization, Technical Leadership, Python, Kanban, Agile Methodologies, Technical Documentation, Architecture, ETL, Docker

Industry

Information Technology/IT

Description

ESSENTIAL SKILLS – JOB DESCRIPTION:

  • Develop and maintain backend systems using Python and PySpark, ensuring high performance, scalability, and reliability
  • Participate in the design and implementation of data engineering and ETL pipelines using PySpark
  • Collaborate with cross-functional teams to identify and prioritize project requirements
  • Mentor and guide engineers, providing technical guidance and code reviews
  • Stay up to date with the latest technologies and frameworks to improve systems and processes
  • Lead technical direction including architecture, design, and implementation of software systems
  • Collaborate with product owners to define and prioritize requirements
  • Develop and maintain technical documentation including architecture diagrams and specifications
  • Participate in code reviews to ensure code quality and adherence to standards
  • Work with DevOps and Operations teams to ensure smooth deployments and operations
  • Apply expertise in DevOps practices and tools for CI/CD pipelines
  • Show willingness to learn new technologies and adapt to new challenges
  • Proven experience leading technical teams including mentoring and coaching
  • Strong technical leadership including prioritization and managing technical resources
  • Drive technical innovation and recommend solutions through research and evaluation
  • Strong problem-solving skills for debugging, optimization, and quality delivery
  • Experience with Agile methodologies like Scrum or Kanban
  • Experience with cloud platforms (AWS, Azure, GCP)
  • Proficient in PySpark for Data Engineering and ETL pipeline development
  • Familiarity with containerization (Docker) and orchestration (Kubernetes)
Job Types: Full-time, Fixed term contract
Work Location: Hybrid remote in Toronto, ON
Responsibilities

Please refer to the job description above for details.
