Senior Data Engineer at Triglobal
Remote, British Columbia, Canada - Full Time


Start Date

Immediate

Expiry Date

25 Sep, 2025

Salary

Up to CAD $110 per hour

Posted On

23 Aug, 2025

Experience

1 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Analytics, Talend, Azure, Databases, Data Quality, Security, Public Sector, Code, Node.js, IT, Apps, Automation, Star, MongoDB, Government, Docker, Containerization, Git, Data Solutions, Data Engineering, Snowflake, Infrastructure, PostgreSQL, Computer Science, Data Integration

Industry

Information Technology/IT

Description

SENIOR DATA ENGINEER (ETL, CI/CD PIPELINES, GIT)

Requisition #: R25-3355 (GOAPRDJP00000672)
Location: Remote (within Canada)
Engagement Type: Contract
Number of Resources required: 1
Rate (CAD): Up to $110 per hour / Commensurate with related experience and market competitiveness
Term: 2025-10-01 to 2026-09-30 with 24-month extension available (up to 36-month contract)
Hours per day: 7.25
Security Screening: Standard (Criminal Record Check)

Submission Deadline/Closing Date: August 26, 2025 (12:00 PM Mountain Time)

Tri-global Solutions Group Inc. is seeking one (1) Senior Data Engineer to join our talented Service Delivery team at the Ministry of Technology and Innovation (Government of Alberta).
WORK MODEL: The successful contractor will work remotely but must be available for onsite meetings if required; it is anticipated that this role will be 100% remote. Work must be done from within Canada at all times due to network and data security policies, and applicants must be authorized to work in Canada (Canadian citizen or permanent resident). Standard hours of work are 08:00 - 16:30 Alberta time, Monday through Friday, excluding observed holidays.
Please review the project overview and requirements below. If you meet the requirements and are interested in submitting for this role, please reply to this job posting.
If you know other consultants who may be interested in this opportunity, kindly share this job posting.
Thank you.
Tri-global Solutions Group Inc.

Website: https://tri-global.com

PROJECT OVERVIEW

The Government of Alberta (GoA) has embarked on transforming the work of government to deliver simpler, more efficient, and better services for the citizens of Alberta, thereby ensuring that the needs of Albertans are effectively met in the digital age. The Province has a strategic role within government to drive efficiencies, innovation and modernization. The Digital Design and Delivery Division (DDD) is the Province’s new centre for digital delivery. It was established to maximize capability and confidence in modern digital practice by ensuring service quality and value through standards and controls. This includes utilizing human-centred design approaches together with agile methodology and modern data practices.
DDD is currently working with Ministries across the GoA, establishing working relationships with partner Ministries throughout this engagement.
We are seeking one (1) Data Engineer to work with DDD on service innovation, program review, and digital transformation projects across the GoA. Data Engineers will work as part of cross-functional program review or product delivery teams. These teams, led by GoA product owners and DDD, work collaboratively and collectively participate in a full range of activities, including field research, backlog definition and refinement, and sprint planning and execution. Digital transformation projects review the current state of services, identify future opportunities, and then deliver new services that are efficient, effective, and affordable.
We are seeking talented and versatile Data Engineer(s) to join our dynamic team. The ideal candidate(s) will have a strong foundation in data engineering practices, combined with the analytical skills necessary to derive actionable insights from data. This role involves designing, implementing, and maintaining robust data pipelines and architectures, as well as performing detailed data analysis to support business decisions.

DESCRIPTION OF SERVICES

The Data Engineer(s) will be required on a full-time basis, working across two (2) to three (3) projects.
Services and project deliverables should evolve as the work progresses, in response to emerging user and business needs, as well as design and technical opportunities. However, the following must be delivered (iteratively) over the course of the project:

Data Engineering:

  • Design, build, and maintain data pipelines on-premises and in the cloud (Azure, GCP, AWS) to ingest, transform, and store large datasets. Ensure pipelines are reliable and support multiple business use cases.
  • Create and optimize dimensional models (star/snowflake) to improve query performance and reporting. Ensure models are consistent, scalable, and easy for analysts to use (a minimal star-schema sketch follows this list).
  • Integrate data from SQL, NoSQL, APIs, and files while maintaining accuracy and completeness. Apply validation checks and monitoring to ensure high-quality data.
  • Improve ETL/ELT processes for efficiency and scalability. Redesign workflows to remove bottlenecks and handle large, disparate datasets.
  • Build and maintain end-to-end ETL/ELT pipelines with SSIS and Azure Data Factory. Implement error handling, logging, and scheduling for dependable operations.
  • Automate deployment, testing, and monitoring of ETL workflows through CI/CD pipelines. Integrate releases into regular deployment cycles for faster, safer updates.
  • Manage data lakes and warehouses with proper governance. Apply security best practices, including access controls and encryption.
  • Partner with engineers, analysts, and stakeholders to translate requirements into solutions. Prepare curated data marts and fact/dimension tables to support self-service analytics.
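
As a hedged illustration of the dimensional-modelling and data-mart work above, here is a minimal sketch in Python using the built-in sqlite3 module. The service-request dataset, table names, and columns are hypothetical and chosen purely for illustration; in practice the warehouse would be Azure- or Snowflake-class infrastructure rather than SQLite.

    import sqlite3

    conn = sqlite3.connect(":memory:")  # SQLite stands in for a real warehouse

    # Star schema: dimension tables hold descriptive attributes, and the
    # fact table holds additive measures plus foreign keys into each dimension.
    conn.executescript("""
        CREATE TABLE dim_date (
            date_key    INTEGER PRIMARY KEY,  -- e.g. 20251001
            full_date   TEXT NOT NULL,
            fiscal_year TEXT NOT NULL
        );
        CREATE TABLE dim_service (
            service_key  INTEGER PRIMARY KEY,
            service_name TEXT NOT NULL,
            ministry     TEXT NOT NULL
        );
        CREATE TABLE fact_service_request (
            request_id  INTEGER PRIMARY KEY,
            date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
            service_key INTEGER NOT NULL REFERENCES dim_service (service_key),
            wait_days   REAL NOT NULL       -- additive measure
        );
    """)

    # Analysts can then aggregate the measure by any dimension attribute:
    for ministry, avg_wait in conn.execute("""
        SELECT s.ministry, AVG(f.wait_days)
        FROM fact_service_request f
        JOIN dim_service s USING (service_key)
        GROUP BY s.ministry
    """):
        print(ministry, avg_wait)

Sharing conformed dimensions such as dim_date across marts is what keeps models consistent and easy for analysts to reuse in self-service analytics.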

Data Analytics:

  • Analyze datasets to identify trends, patterns, and anomalies. Use statistical methods, DAX, Python, and R to generate insights that inform business strategies (a minimal sketch follows this list).
  • Develop interactive dashboards and reports in Power BI using DAX for calculated columns and measures. Track key performance metrics, share service dashboards, and present results effectively.
  • Build predictive or descriptive models using statistical, Python, or R-based machine learning methods. Design and integrate data models to improve service delivery.
  • Present findings to non-technical audiences in clear, actionable terms. Translate complex data into business-focused insights and recommendations.
  • Deliver analytics solutions iteratively in an Agile environment. Mentor teams to enhance analytics fluency and support self-service capabilities.
  • Provide data-driven evidence to guide corporate priorities. Ensure strategies and initiatives are backed by strong analysis, visualizations, and models.
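
A minimal sketch of the trend/anomaly analysis referenced above, using pandas and NumPy on synthetic data. The rolling z-score test shown here is one simple method among many, and real inputs would come from the curated data marts rather than a random generator.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(seed=0)

    # Synthetic daily request volumes; stands in for data pulled from a mart.
    df = pd.DataFrame({
        "date": pd.date_range("2025-01-01", periods=90, freq="D"),
        "requests": rng.poisson(lam=200, size=90),
    })
    df.loc[60, "requests"] = 450  # planted anomaly for the demo

    # Flag days that sit far outside a 14-day rolling baseline.
    baseline = df["requests"].rolling(14, min_periods=7).mean()
    spread = df["requests"].rolling(14, min_periods=7).std()
    df["anomaly"] = (df["requests"] - baseline).abs() > 3 * spread

    print(df.loc[df["anomaly"], ["date", "requests"]])

In this role the flagged rows would typically surface in a Power BI dashboard via a DAX measure rather than a console print.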

MANDATORY REQUIREMENTS

  • Bachelor's degree in Computer Science, IT, or a related field of study.
  • Designing efficient dimensional models (star and snowflake schemas) for warehousing and analytics. (3 years+)
  • Ensuring data quality, security, and governance. (3 years+)
  • Experience as a Data Analyst, Data Engineer or in a similar role. (5 years+)
  • Experience using Git, collaborative workflows, CI/CD pipelines, containerization (Docker/Kubernetes), and Infrastructure as Code (Terraform, ARM, CloudFormation) to deploy and migrate data solutions. (2 years+)
  • Experience with manipulating and extracting data from diverse on-premises and cloud-based sources. (5 years+)
  • Experience with SSIS, Azure Data Factory (ADF), and using APIs for extracting and integrating data across multiple platforms and applications; a minimal sketch follows this list. (3 years+)
  • Performing migrations across on-premises, cloud, and cross-database environments. (2 years+)
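
As a sketch of the API extraction, validation, and cross-database loading experience listed above: the endpoint below is a placeholder, SQLite stands in for the target database, and in an actual engagement SSIS or Azure Data Factory would orchestrate these steps with proper logging and scheduling.

    import json
    import sqlite3

    import requests

    API_URL = "https://api.example.gov/v1/records"  # placeholder endpoint

    def extract(page_size=500):
        """Pull all pages of JSON records from a paginated REST API."""
        rows, page = [], 1
        while True:
            resp = requests.get(
                API_URL,
                params={"page": page, "per_page": page_size},
                timeout=30,
            )
            resp.raise_for_status()
            batch = resp.json()
            if not batch:
                return rows
            rows.extend(batch)
            page += 1

    def validate(rows):
        """Quality gate: every record has a unique, non-null id."""
        seen = set()
        for row in rows:
            if row.get("id") is None or row["id"] in seen:
                raise ValueError(f"bad or duplicate record: {row}")
            seen.add(row["id"])
        return rows

    def load(rows, conn):
        """Upsert validated records into a staging table."""
        conn.execute(
            "CREATE TABLE IF NOT EXISTS staging_records "
            "(id INTEGER PRIMARY KEY, payload TEXT)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO staging_records VALUES (?, ?)",
            [(r["id"], json.dumps(r)) for r in rows],
        )
        conn.commit()

    if __name__ == "__main__":
        load(validate(extract()), sqlite3.connect("staging.db"))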

DESIRABLE REQUIREMENTS

  • Experience in application development, with working knowledge of modern technologies including Next.js, Node.js, D3.js, GitHub Actions, and BuildMaster automation. (2 years+)
  • Experience with databases and data integration, including PostgreSQL, MongoDB, Azure Cosmos DB, Azure Synapse, and Talend. (2 years+)
  • Exposure to AI/ML tools and workflows relevant to data engineering, such as integrating AI-driven analytics or automation within cloud platforms like Databricks and Azure. (1 year+)

How To Apply:

In case you would like to apply to this job directly from the source, please click here

Responsibilities

Please refer to the job description for details
