Cloud Data Warehouse Engineer at AppTad Technologies Pvt Ltd
Montréal, QC, Canada
Full Time


Start Date

Immediate

Expiry Date

15 Nov, 25

Salary

900,000

Posted On

15 Aug, 25

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Computer Science, dbt, NumPy, Pandas, Data Analysis, Data Solutions, Python, Cloud, Stored Procedures, Data Models, Data Warehouse Architecture, Communication Skills, Clarity, Data Warehouse, Information Technology, Airflow, Snowflake

Industry

Information Technology/IT

Description

JOB DESCRIPTION:

We are looking for an experienced Cloud Data Warehouse Engineer to become a key member of our vibrant Data Warehouse team. In this role, you will play a key part in developing our next-generation data platform, which sources, consolidates, and manages data from multiple technology systems across the organization. This platform will enable sophisticated reporting and analytics solutions, specifically designed to support the Technology Risk functions.

Your main focus will be on building and refining our Cloud Data Warehouse using Snowflake and Python-based tools. Your expertise will help create reliable data models, utilizing Snowflake features such as data sharing, time travel, Snowpark, workload management, and the ingestion of both structured and unstructured data. Additionally, you will integrate Snowflake with internal platforms for data quality management, cataloging, discovery, and real-time monitoring. By collaborating closely with data engineers, analysts, ETL developers, infrastructure teams, and business stakeholders, you will help build a high-performance, scalable data environment that supports advanced analytics and AI initiatives.
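
For illustration only (not part of the original posting): a minimal Snowpark sketch of the Python-on-Snowflake work described above, including a time travel read. The connection values and the risk_events table are placeholders, not details from this posting.

    # Illustrative sketch only: connection values and the risk_events table
    # are placeholders, not details from this posting.
    from snowflake.snowpark import Session

    # In practice credentials come from a secrets manager, not literals.
    session = Session.builder.configs({
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }).create()

    # Time travel: read the table as it looked one hour ago.
    session.sql("SELECT * FROM risk_events AT(OFFSET => -3600)").show()

    session.close()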

REQUIREMENTS:

  • Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or a related field.
  • Minimum of 10 years’ experience in complex data environments, managing large data volumes.
  • At least 7 years of hands-on SQL/PLSQL experience with complex data analysis.
  • 5+ years of experience developing data solutions on Snowflake.
  • 3+ years of building data pipelines and warehousing solutions using Python (libraries such as Pandas, NumPy, PySpark).
  • 3+ years of experience working in hybrid data environments (On-Prem & Cloud).
  • Proven hands-on experience with Python is mandatory.
  • Extensive experience with Airflow or similar orchestration tools (e.g., Dagster); a minimal DAG sketch follows this list.
  • Certification: Snowflake SnowPro Core is required; SnowPro Advanced Architect/Data Engineer is a plus.
  • Experience with dbt is advantageous.
  • Skill in performance tuning SQL queries, Spark jobs, and stored procedures.
  • Solid understanding of E-R data models, data warehouse architecture, and advanced modeling concepts.
  • Strong analytical capabilities to interpret complex requirements into technical solutions.
  • Excellent verbal and written communication skills.
  • Proven ability to manage multiple projects with minimal supervision, adaptable to changing priorities.
  • Strong problem-solving skills with a focus on clarity and business impact.
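
As a hedged sketch of the pipeline experience described above (not a requirement of the posting), the following minimal Airflow DAG (Airflow 2.4+ style) wires a toy Pandas transform into a daily schedule. The DAG id, schedule, and transform are assumptions for illustration.

    # Illustrative sketch only: DAG id, schedule, and the toy transform are
    # assumptions, not requirements from this posting.
    from datetime import datetime

    import pandas as pd
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def transform_events():
        # Toy Pandas step standing in for a real extract/transform.
        df = pd.DataFrame({"event": ["a", "b"], "count": [1, 2]})
        df["count"] = df["count"] * 2
        print(df.to_string(index=False))

    with DAG(
        dag_id="example_daily_load",   # hypothetical name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ):
        PythonOperator(task_id="transform_events", python_callable=transform_events)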

How To Apply:

In case you would like to apply to this job directly from the source, please click here

Responsibilities
  • Design, develop, and manage scalable Snowflake data warehouse solutions.
  • Establish and promote best practices for efficient Snowflake usage, integrating tools like Airflow, dbt, and Spark.
  • Assist in testing and deploying data pipelines using standard frameworks and CI/CD processes.
  • Monitor, tune, and optimize query performance and data loads (see the sketch after this list).
  • Support QA & UAT processes to confirm data integrity and troubleshoot issues.
  • Collaborate with cross-functional teams to ensure seamless integration of data solutions.
  • Contribute to documentation, data governance, and operational procedures to sustain system health and security.
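
As an illustrative sketch of the monitoring responsibility above, this snippet uses the Snowflake Python connector to pull the slowest recent queries from the ACCOUNT_USAGE.QUERY_HISTORY view; credentials are placeholders.

    # Illustrative sketch only: one way to surface slow queries for tuning.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="<account_identifier>",
        user="<user>",
        password="<password>",
    )
    try:
        cur = conn.cursor()
        # Ten slowest queries of the last day, a common starting point for tuning.
        cur.execute(
            """
            SELECT query_id, total_elapsed_time / 1000 AS seconds, query_text
            FROM snowflake.account_usage.query_history
            WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
            ORDER BY total_elapsed_time DESC
            LIMIT 10
            """
        )
        for query_id, seconds, query_text in cur.fetchall():
            print(f"{query_id}: {seconds:.1f}s  {query_text[:80]}")
    finally:
        conn.close()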