Dataform Architect at Vtrac Consulting Corporation
Toronto, ON, Canada
Full Time


Start Date

Immediate

Expiry Date

26 Nov, 25

Salary

50.0

Posted On

27 Aug, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Security, Mathematics, Cost Management, Gitlab, Data Governance, Collaboration, Cloud Computing, Github, Emerging Technologies, Data Quality, Git, Data Engineering, Computer Science

Industry

Information Technology/IT

Description


We are looking for a Dataform Architect to design and optimize data pipelines on Google Cloud Platform, leveraging Dataform and BigQuery. This role involves building reusable SQLX workflows, ensuring data quality and compliance, managing performance and costs, and collaborating with stakeholders to deliver scalable data solutions while mentoring junior team members.

QUALIFICATIONS:

  • 5+ years of experience in data engineering, with a focus on SQL and cloud-based data platforms.
  • Strong experience with Dataform and GCP, including BigQuery and other related services.
  • Experience with version control systems such as Git, GitHub, or GitLab.
  • Knowledge of data security, privacy, and compliance standards.
  • Excellent communication, collaboration, and problem-solving skills.
  • Experience with performance optimization and cost management in BigQuery.
  • Familiarity with data governance and data quality best practices.
  • Ability to work effectively in a fast-paced and dynamic environment.
  • Passionate about staying current with emerging technologies and trends in data engineering and cloud computing.
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field.
We thank all candidates in advance. Only candidates selected for interviews will be contacted. For other exciting opportunities, please visit us at www.vtrac.com. VTRAC is an equal-opportunity employer.
Responsibilities
  • Design and develop data pipelines using Dataform and SQL on Google Cloud Platform (GCP).
  • Implement reusable SQLX files for data transformation workflows and queries.
  • Optimize and manage BigQuery environments for performance and cost-effectiveness.
  • Collaborate with stakeholders to identify data requirements and develop data solutions.
  • Maintain data integrity, security, and compliance through best practices and governance.
  • Design and implement data quality checks and validation processes.
  • Ensure version control and change management for data pipelines using tools such as Git.
  • Develop and maintain documentation for data pipelines, including technical specifications and architecture diagrams.
  • Stay up to date with industry trends and emerging technologies in data engineering and cloud computing.
  • Mentor and provide guidance to junior team members on data pipeline development and best practices.
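To illustrate the kind of reusable SQLX workflow this role involves, here is a minimal sketch of a Dataform SQLX file that defines a BigQuery table with built-in data quality assertions. The table names (raw_orders) and columns (created_at, amount) are hypothetical placeholders, not part of any actual pipeline:

```sqlx
config {
  type: "table",
  description: "Illustrative daily order totals (hypothetical schema)",
  assertions: {
    nonNull: ["order_date"],
    uniqueKey: ["order_date"]
  }
}

SELECT
  DATE(created_at) AS order_date,
  SUM(amount) AS total_amount
FROM ${ref("raw_orders")}
GROUP BY order_date
```

The `config` block declares the output type and attaches assertions that Dataform compiles into validation queries, while `${ref(...)}` resolves the upstream dependency so Dataform can order the pipeline and track lineage.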