Start Date
Immediate
Expiry Date
28 Sep, 25
Salary
60.0
Posted On
29 Jun, 25
Experience
5 year(s) or above
Remote Job
Yes
Telecommute
Yes
Sponsor Visa
No
Skills
Software Development, Optimization Strategies, Computer Science, ETL, Stored Procedures, Data Processing, Automation, Data Architecture, Data Modeling, SQL Server, Python, Regulations
Industry
Information Technology/IT
Job Title: Data Engineer
Contract Duration: 1+ years
Pay range: C$60 - 70/hr
Location: Regina, SK
Work Type: Onsite, Monday to Friday, 8 am to 5 pm.
JOB SUMMARY:
As our client continues to expand its data-driven culture, Data Engineers play a critical role in enabling scalable, secure, and high-performance data solutions.
Data Engineers work closely with business units and analytics professionals to design, build, and maintain data pipelines and architectures that support advanced analytics, reporting, and operational systems.
This includes integrating data from legacy systems, cloud platforms, and modern data lakehouse environments to ensure data is accessible, reliable, and optimized for use.
DESCRIPTION OF REQUIREMENTS:
To be considered, the Data Engineer MUST have a minimum of 5 years of recent experience (within the last 7 years) with modern data management principles, including but not limited to ETL, practical data design, architecture, management, modelling, quality, and analytics.
The successful candidate will demonstrate a broad and solid understanding of these principles.
EDUCATIONAL & EXPERIENCE REQUIREMENTS:
Degree or Diploma in Computer Science, Engineering, Data Sciences, or a quantitative discipline
5+ years’ recent experience in ETL, data design, data architecture, data management, and data modeling
Relevant job experience in North America
CORE TECHNICAL SKILLS:
SQL Server & SSIS: Expert proficiency with SQL Server (on-premises), including stored procedures and SSIS package-level deployment.
Data Pipelines: Proven experience designing, creating, and maintaining robust data pipelines and ETL processes.
Monitoring: Skilled in monitoring and troubleshooting database issues to ensure compliance with policies and regulations.
Python for ETL: Advanced Python skills applied to developing ETL processes following software development best practices (including automated testing and code reviews).
Big Data Tools: Proficient in leveraging big data technologies, including PySpark and SparkSQL for large-scale data processing.
Cloud Expertise: Hands-on experience with cloud-based platforms such as Databricks, Azure Data Factory, and Azure Data Lake.
Lakehouse Architecture: Knowledgeable in implementing lakehouse architectures using Delta format and optimization strategies.
API Integration: Experience working with external third-party APIs as ETL sources, including Microsoft Graph APIs, to integrate and automate tasks across Microsoft services.
Automation & Deployment: Familiar with CI/CD processes and tools—including Databricks asset bundles (DABs) for managing workflows—and proficient with version control systems (e.g., Git) for ETL deployments.
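To illustrate the "Python for ETL" expectation above, here is a minimal extract/transform/load sketch with the kind of validation step that automated testing would cover. All schema, data, and function names are hypothetical illustrations, not taken from the client's actual systems:

```python
import csv
import io

def extract(source):
    """Read raw rows from a CSV source (hypothetical schema: id, amount)."""
    return list(csv.DictReader(source))

def transform(rows):
    """Cast types and drop malformed rows -- the validation step that
    automated tests in a CI pipeline would typically exercise."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip records that fail type checks
    return cleaned

def load(rows):
    """Stand-in for a warehouse load; returns summary stats instead."""
    return {"row_count": len(rows), "total": sum(r["amount"] for r in rows)}

# Hypothetical input: one malformed record ("bad") is dropped in transform.
raw = io.StringIO("id,amount\n1,10.5\n2,bad\n3,4.5\n")
result = load(transform(extract(raw)))
```

In production this pattern would target PySpark DataFrames and Delta tables rather than in-memory lists, but the separation into small, independently testable extract/transform/load functions is the same.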
MANDATORY REQUIREMENTS (PASS/FAIL):
Candidate must have a Bachelor’s degree or technical Diploma in Computer Science/Engineering, Data Sciences, or a related discipline, or 5 years of related work experience.
Candidate must have a minimum of 5 years’ experience within the last 7 years working with modern data management principles.
Candidate must be able to work and be located full-time on site at our client’s Head Office in Regina, SK.
RESPONSIBILITIES:
Work closely with the Enterprise Analytics team to create and maintain ELT processes.
Assemble large, complex datasets that meet functional/non-functional business requirements.
Consult DT&S and Business leaders on data and information management practices and governance.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and other technologies.
Work with stakeholders, including the Executive, to assist with data-related technical issues and support their data infrastructure needs.
Create data tools for analytics and data scientist team members that assist them in building and optimizing our business.
Work with data and analytics experts to strive for greater functionality in our data systems.
Champion efforts to improve business performance through enterprise information capabilities, such as master data management (MDM), metadata management, analytics, content management, data integration, and related data management or data infrastructure.
Provide insight into the changing database integration, storage, and utilization requirements for the company and offer suggestions for solutions.
Monitor and understand Information Management trends and emerging technologies.