Senior Integration Developer at Jade Global
Mississauga, Ontario, Canada - Full Time


Start Date

Immediate

Expiry Date

01 May, 26

Salary

0.0

Posted On

31 Jan, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, Apache Airflow, Data Integration, ETL/ELT, Data Warehouses, Data Lakes, SQL, REST APIs, Cloud Storage, CI/CD, Git, Data Governance, Data Transformation, Debugging, Performance Monitoring

Industry

IT Services and IT Consulting

Description
We are seeking a skilled and motivated Data Integration Developer with strong expertise in Python and hands-on experience in building and managing Apache Airflow pipelines. The ideal candidate will be responsible for developing, maintaining, and optimizing data integration workflows that support our enterprise data platform and analytics initiatives.

Key Responsibilities
- Design, build, and manage scalable and maintainable ETL/ELT pipelines using Apache Airflow and Python.
- Integrate data from multiple sources (databases, APIs, flat files, cloud services, etc.) into data warehouses or data lakes.
- Develop reusable Python modules and scripts to support data transformation and validation.
- Collaborate with data engineers, analysts, and platform teams to understand integration requirements and deliver solutions.
- Monitor, debug, and improve Airflow DAGs for performance, reliability, and fault tolerance.
- Ensure data integrity and compliance with enterprise data governance policies.
- Maintain documentation for integration workflows and operational procedures.

Required Skills and Experience
- Strong proficiency in Python with experience writing modular, testable, and reusable code.
- Hands-on experience with Apache Airflow (writing custom DAGs, operators, and managing schedules).
- Experience with data integration, ETL/ELT processes, and working with structured/unstructured data.
- Familiarity with relational databases (e.g., PostgreSQL, MySQL, SQL Server) and writing efficient SQL queries.
- Experience integrating with REST APIs, file systems (SFTP/FTP), and cloud storage (S3, Azure Blob, etc.).
- Understanding of CI/CD practices and version control (e.g., Git).
- Strong problem-solving skills and attention to detail.

Compensation
Our salary ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target for new hire salaries of the position across all US locations. Within the range, individual pay is determined by work location and additional job-related factors, including knowledge, skills, experience, tenure, and relevant education or training. The pay scale is subject to change depending on business needs. Your recruiter can share more about the specific salary range for your preferred location during the hiring process. Additional compensation may include benefits, discretionary bonuses, and equity.

Working at Jade Global
Talented people are drawn to world-class organizations that offer outstanding opportunities, and Jade Global is an employer of choice for individuals around the world. We invest in each employee’s personal and professional wellbeing because we understand that client success, as well as our ultimate success, starts with our employees. We seek to provide the benefits you need while standing behind you every step of the way. Our programs include health-related policies and a leave donation policy.
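To illustrate the kind of Airflow and Python work this role involves, below is a minimal sketch of a daily extract-transform-load DAG. It is an illustrative example only, assuming Airflow 2.x (2.4+ for the schedule argument); the DAG id, task logic, and data are hypothetical, not part of this posting.

```python
# Minimal sketch of a daily ETL DAG (illustrative; names and data are hypothetical).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    """Pull raw records from a source system (e.g., a REST API or database)."""
    # Placeholder: a real pipeline would use an Airflow hook or client library here.
    return [{"id": 1, "value": "example"}, {"id": None, "value": "bad row"}]


def transform(**context):
    """Validate and reshape the records produced by the extract task."""
    records = context["ti"].xcom_pull(task_ids="extract")
    return [r for r in records if r.get("id") is not None]


def load(**context):
    """Write transformed records to the warehouse or lake."""
    records = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loaded {len(records)} records")


default_args = {
    "owner": "data-integration",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_daily_integration",
    default_args=default_args,
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

In practice, the extract and load tasks would call the hooks or client libraries for the actual source systems and target warehouse, rather than returning in-memory sample data.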
Responsibilities
The role involves developing, maintaining, and optimizing data integration workflows using Python and Apache Airflow to support enterprise data platforms and analytics initiatives. This includes designing scalable ETL/ELT pipelines, integrating data from various sources, and ensuring data integrity and compliance.
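The posting also calls for reusable Python modules that support data transformation and validation. As a rough, hypothetical sketch of such a helper (the field names and rules are invented for illustration; Python 3.9+ assumed for the type hints):

```python
# Illustrative validation helper; field names and rules are hypothetical.
from typing import Iterable, Iterator


def drop_invalid(records: Iterable[dict], required_fields: tuple[str, ...]) -> Iterator[dict]:
    """Yield only records that contain every required field with a non-null value."""
    for record in records:
        if all(record.get(field) is not None for field in required_fields):
            yield record


if __name__ == "__main__":
    raw = [{"id": 1, "email": "a@example.com"}, {"id": 2, "email": None}]
    cleaned = list(drop_invalid(raw, required_fields=("id", "email")))
    print(cleaned)  # only the first record passes validation
```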