Senior Data Engineer at Princeton IT Services
Calgary, AB, Canada
Full Time


Start Date

Immediate

Expiry Date

05 Dec, 25

Salary

$50.00 per hour

Posted On

06 Sep, 25

Experience

1 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Information Systems, Capital Markets, Snowflake, Computer Science, Python, DBT, AWS, Data Modeling, Analytical Skills, Communication Skills, Octopus, Power BI, Data Engineering

Industry

Information Technology/IT

Description

OVERVIEW

We are seeking a highly skilled and proactive Senior Data Engineer to design, build, and optimize data pipelines and cloud-based data solutions. The ideal candidate will have hands-on expertise with Snowflake, AWS (API Gateway, Lambda, IAM, S3, SNS, SQS), DBT, Terraform, and Power BI, with strong experience in SQL and Python for data engineering tasks.

REQUIREMENTS

  • Bachelor’s Degree in Computer Science, Information Systems, or related field.
  • 5+ years of hands-on technical experience in data engineering or software engineering.
  • 3+ years building/supporting SQL-based data pipelines (Snowflake, MSSQL, or similar).
  • 3+ years of cloud experience (minimum 1+ years with AWS).
  • 2+ years of experience with Power BI (DAX, data modeling, report optimization).
  • 2+ years of programming experience with Python for data engineering tasks.
  • Experience with CI/CD tools (CDK, Terraform, GitHub Actions, or similar).
  • Knowledge of data governance frameworks is a plus.
  • Experience with DBT, Terraform, and Octopus is an added advantage.

PREFERRED SKILLS

  • Strong problem-solving and analytical skills.
  • Excellent communication skills to interact with technical and non-technical stakeholders.
  • Experience working in capital markets or similar industries is a plus.

Job Type: Full-time
Pay: $50.00-$53.00 per hour

Education:

  • Master’s Degree (required)

Experience:

  • Snowflake: 6 years (required)
  • Data Engineer: 10 years (required)
  • AWS services such as Lambda, Redshift, DMS: 8 years (required)
  • Power BI: 6 years (required)

How To Apply:

In case you would like to apply to this job directly from the source, please click here.

Responsibilities
  • Design, develop, and maintain scalable ETL/data pipelines using AWS services (Lambda, API Gateway, S3, etc.).
  • Manage and optimize Snowflake tables, DDL/DML queries, and DBT models.
  • Write, modify, and optimize complex SQL queries for data ingestion, transformation, and reporting.
  • Integrate data from multiple sources (on-premise and SaaS applications) into cloud data warehouses.
  • Develop and enhance data visualizations and dashboards in Power BI, including DAX and performance tuning.
  • Implement CI/CD practices using Terraform, GitHub Actions, or similar tools.
  • Utilize Python for data processing, large dataset manipulations, and API-based ingestion.
  • Ensure data governance, security, and compliance standards are maintained.
  • Collaborate with cross-functional teams to support data-driven initiatives.
  • Continuously learn and adapt to emerging tools and technologies.