Data Engineer at Alberta Securities Commission
Calgary, AB, Canada
Full Time


Start Date

Immediate

Expiry Date

03 Dec, 25

Salary

Not specified

Posted On

04 Sep, 25

Experience

8 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

Our organization:
The Alberta Securities Commission (ASC) is the industry-funded regulator responsible for administering the province’s securities laws. It is entrusted with fostering a fair and efficient capital market in Alberta and with protecting investors. As a member of the Canadian Securities Administrators (CSA), the ASC works to improve, coordinate and harmonize the regulation of Canada’s capital markets.
The Information Technology (IT) team is responsible for ensuring that the ASC has the resources necessary for the consistent, reliable and secure delivery of services, and for understanding and anticipating the organization's unique technology requirements.
The opportunity:
Reporting to the Manager, Data Management, the Data Engineer fills a critical role responsible for designing, building and maintaining the data pipelines and infrastructure that support the ASC's data-driven initiatives. As the volume and complexity of our data continue to grow, this role ensures that data remains accessible, reliable and secure. The Data Engineer will play a pivotal part in enabling data-driven decision-making and driving innovation across the organization.

Key responsibilities include:

  • Developing efficient and scalable data pipelines to ingest/extract, transform, and load data from various sources into our data stores.
  • Identifying and addressing performance bottlenecks in data pipelines and queries to ensure optimal data access.
  • Monitoring data quality and implementing improvements as needed.
  • Working with stakeholders to assist with data-related technical issues and supporting their data infrastructure needs.
  • Developing and maintaining documentation related to data pipeline architecture, development processes and data governance standards.
  • Staying current with industry trends and developments in cloud computing, data management and AI.
  • Developing and updating data integration standards and best practices.
  • Using DevOps Continuous Integration and Delivery (CI/CD) processes.
  • Providing hands-on data integration support as part of daily operations. This includes monitoring, configuration, troubleshooting, user administration, and optimization of data objects for consumers.
  • Working closely with the Data Architect to design and construct a highly scalable data management platform.
  • Collaborating with the Data Architect to implement effective data models and schemas.
  • Working closely with the cybersecurity team to establish access controls in analytics solutions.

The ideal candidate will possess:

  • A college diploma or university degree in computer science, information systems, or computer engineering.
  • A minimum of 8 years of proven experience in a Data Engineer role.
  • An Azure or other cloud certification (e.g. Fabric Data Engineer).
  • Other data- or security-related certifications would be considered assets.
  • Strong knowledge of applicable data privacy practices and regulations.
  • Hands-on experience with cloud services, particularly Microsoft Fabric.
  • Strong programming and query language skills in Python, SQL and KQL.
  • Familiarity with federated data governance and self-serve data platforms.
  • Excellent communication skills, with the ability to translate non-technical requirements into technical solutions.
  • Strong organizational skills, including attention to detail and the ability to multitask.
  • A results-oriented and deadline-driven approach.
  • The ability to multi-task, set priorities, coordinate and schedule.
  • Excellent critical thinking, analytical and decision-making capabilities.

To apply:
Click the Apply For This Job Online button to submit your resume, cover letter and salary expectations by September 23, 2025. This position will work out of the ASC office located in Calgary, Alberta. You will be contacted if you are selected for an interview. More information about working at the ASC, including our comprehensive Total Rewards package, can be found on our website at www.asc.ca.
We offer a hybrid work environment and flexibility, along with a competitive total rewards package consisting of 100 per cent employer-paid benefits: comprehensive health and dental coverage, employee life insurance, short-term and long-term disability, retirement benefits, travel insurance, paid vacation time, flex and sick days, an employee family assistance program, a transportation allowance, a generous flexible spending account, and professional development through subsidized courses, conferences, workshops, seminars and in-house training. We also encourage fun and giving back to the community through initiatives offered by our ASC Social Club and annual United Way Campaign.
The ASC is an equal opportunity employer and encourages applications from all qualified individuals. We celebrate diversity and are committed to providing an inclusive work environment where every employee feels valued and respected.
