Velocity - Data Engineer Internship/Co-Op - Winter 2026 at Tangerine
Toronto, ON, Canada
Full Time


Start Date

Immediate

Expiry Date

29 Sep, 25

Salary

0.0

Posted On

04 Sep, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Spark, Artifactory, Airflow, Jenkins, AWS, Python, R, Bash, SQL Server, Sqoop, Azure

Industry

Banking/Mortgage

Description

Requisition ID: 234792
Join a purpose-driven, winning team, committed to results, in an inclusive and high-performing culture.
Velocity - Data Engineer Internship/Co-Op – Winter 2026
Term: January 2026 – April 2026
Work Hours/Week: 37.5
Application Deadline: 9/29/25



Responsibilities

IS THIS ROLE RIGHT FOR YOU? IN THIS ROLE, YOU WILL:

  • Perform data exploration across different data sources and support model training

  • Productionalize/operationalize advanced analytics models following the Bank's standards. ModelOps plays a key role as a liaison with other teams (Security, Privacy, Data Office, Analytics, and Technology, among others)

  • Automate data pipelines and trigger actions/events to streamline business processes and model enablement
  • Enable mechanisms for Data Scientists and stakeholders to monitor the performance of a model
  • Monitor the run time of the models, and work closely with Data Scientists on continuous optimization
  • Monitor data pipelines, and liaise with vendors and source-system groups in the event of failures/maintenance
  • Migrate data pipelines and models to new analytics platforms
  • Experiment & learn!
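To give a flavour of the pipeline-automation and run-time-monitoring duties above, here is a minimal, standard-library-only Python sketch. The `run_pipeline` helper and the stage names are hypothetical illustrations only, not part of the team's actual Airflow/Spark stack:

```python
import time

def run_pipeline(stages):
    """Run named pipeline stages in order, timing each one.

    `stages` is a list of (name, callable) pairs. Returns a dict of
    stage name -> elapsed seconds; re-raises on the first failure so
    a scheduler (e.g. Airflow) can alert and retry.
    """
    timings = {}
    for name, fn in stages:
        start = time.perf_counter()
        try:
            fn()
        except Exception:
            # Surface which stage broke and how long it ran before failing.
            print(f"stage {name!r} failed after "
                  f"{time.perf_counter() - start:.3f}s")
            raise
        timings[name] = time.perf_counter() - start
    return timings

if __name__ == "__main__":
    # Toy extract/transform/load stages standing in for real jobs.
    timings = run_pipeline([
        ("extract", lambda: time.sleep(0.01)),
        ("transform", lambda: None),
        ("load", lambda: None),
    ])
    for name, secss in timings.items():
        print(f"{name}: {secs:.3f}s")
```

In practice a scheduler such as Airflow would own the retries, alerting, and timing, but the shape (named stages run in order, with per-stage timing and failure reporting) is the same.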

DO YOU HAVE THE SKILLS AND REQUIREMENTS THAT WILL ENABLE YOU TO SUCCEED IN THIS ROLE? – WE’D LOVE TO WORK WITH YOU IF:

  • You are currently enrolled in post-secondary education.

  • You love to learn and envision yourself working for an international organization that heavily invests in your future.

  • You enjoy being involved in extracurricular activities such as conferences, clubs, and hackathons.
  • You have knowledge/experience in:
      • Big Data technologies (Hive/Beeline, HDFS, Sqoop, Spark)
      • Cloud platforms: GCP, Azure, or AWS
      • MinIO data storage, Airflow, Trino
      • DB2, SQL Server
      • Python (PyHive, PySpark), R, SQL, shell (Bash, Korn), and general Linux/Unix
      • CI/CD tech stack (BitBucket, Jenkins, Artifactory)
  • This internship requires you to work 37.5 hours a week.