Data Engineer at Procom
Calgary, AB, Canada - Full Time


Start Date

Immediate

Expiry Date

11 Nov, 25

Salary

0.0

Posted On

12 Aug, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

DDL, Data Migration, Microsoft Azure, Data Architecture

Industry

Information Technology/IT

Description

DATA ENGINEER:

On behalf of our Oil & Gas client, Procom is searching for a Data Engineer for a 7-month role. This position is onsite at our client’s Calgary office.

DATA ENGINEER - JOB DESCRIPTION:

This project involves upgrading all ETLs by transitioning Informatica-mediated data-loading workflows to Azure Data Factory and Azure Databricks. This upgrade will facilitate migration of the IDB to an Azure SQL Managed Instance.

DATA ENGINEER - MANDATORY SKILLS:

  • Experienced in developing data engineering pipelines using Azure Data Factory, MS-Azure, and Azure Databricks.
  • Proficient in designing and developing data engineering components of cloud data architecture in MS Azure.
  • Knowledge of Enterprise Cloud Data Architecture in Microsoft Azure.
  • Ability to create and test DDL and DML scripts.
  • Python scripting experience.
  • Experience with Azure DevOps.
  • Strong communication and teamwork skills.

DATA ENGINEER - NICE-TO-HAVE SKILLS:

  • Ability to provide recommendations to support data strategy.
  • Experience with enterprise data migration to cloud-based technologies.
  • Skilled in managing deployable data pipelines in Microsoft Azure environment.
  • Informatica PowerCenter experience.

DATA ENGINEER - RESPONSIBILITIES:

  • Work closely with the Practice Lead – Data Engineering to design, implement, and deploy data transformation and consolidation pipelines.
  • Analyze and understand existing Informatica ETL code, then design and rewrite it using Azure Data Factory and Databricks.
  • Collaborate with Subject Matter Experts and Business Analysts to understand rewrite requirements and populate data schemas.
  • Document design and architecture as needed.
  • Conduct testing of data pipelines and support user acceptance testing cycles.
  • Combine data from multiple sources to build the target consolidated state, documenting source-to-target mappings.
  • Identify and document data dependencies.
  • Engage in agile delivery practices, participate in sprint planning, estimation, and retrospectives.