Senior Software Engineer - Data Platform & AI at CanCap Management Inc
Mississauga, ON L4W 5L6, Canada - Full Time


Start Date

Immediate

Expiry Date

30 Oct, 25

Salary

Not specified

Posted On

30 Jul, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Docker, Kubernetes, HealthTech, FinTech, Platform Development, Systems Engineering

Industry

Information Technology/IT

Description

CanCap Group Inc. is part of a privately owned, Canadian national financial services company with multiple verticals across automotive, consumer, and merchant lending portfolios. We manage the entire lifecycle of the finance receivable, from credit adjudication through contract administration, customer service, default management, and post-charge-off recoveries. We are a company of innovators: we learn from each other, respect each other, and create together. When it comes to our customers, partners, and each other, we are always motivated by doing the “right thing”. We are always looking for the best people and the right methods that allow us to meet this goal as we look to the future for growth.

PREFERRED QUALIFICATIONS

  • Certification in one or more programming languages.
  • GCP certification (e.g., Data Engineer, Cloud Developer).
  • Certifications in Databricks (e.g., Databricks Certified Data Engineer).
  • Experience with multi-cloud environments.
  • Expertise in software engineering best practices and ML platform development.
  • Understanding of networking concepts (VPC, firewall rules, peering, etc.) in GCP.
  • Experience in FinTech, HealthTech, or regulated industries.
  • Exposure to Kubernetes, Docker, and distributed systems engineering.

How To Apply:

In case you would like to apply to this job directly from the source, please click here.

Responsibilities
  • Design and develop scalable, secure, and high-performance data platforms using technologies like Databricks, Apache Spark, Kafka, Delta Lake, and cloud-native services (GCP/AWS/Azure).
  • Develop backend systems based on open-source technologies to support Data Platform and AI needs.
  • Implement streaming and batch processing architectures to support analytics and AI workloads.
  • Design, build, and optimize scalable and reliable data pipelines using Python, Databricks workflows, and related ETL frameworks (see the streaming pipeline sketch after this list).
  • Develop and deploy modular AI/ML components that integrate with the data platform using frameworks like PyTorch, TensorFlow, Hugging Face Transformers, etc.
  • Work with MLOps tools (e.g., MLflow, Vertex AI, SageMaker) to manage model lifecycle, reproducibility, and deployment pipelines (see the MLflow sketch after this list).
  • Integrate third-party AI services (e.g., OpenAI) into internal applications and decision engines.
  • Develop and maintain robust data models and transformations using DBT (Data Build Tool).
  • Manage and orchestrate data workflows on Google Cloud Platform (GCP) using DBT, Databricks, Dataflow, Cloud Composer, and Cloud Storage.
  • Collaborate with stakeholders to understand data needs and provide clean, structured data for analytics and reporting.
  • Implement data quality, validation, and governance practices to ensure trust in the data platform.
  • Mentor junior engineers and participate in design and code reviews.
  • Monitor and troubleshoot data pipeline issues, ensuring performance and reliability.
  • Troubleshoot and resolve Databricks platform issues, identifying root causes of system performance bottlenecks.
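
Illustrative sketch (streaming pipeline): the snippet below is a minimal example of the kind of Kafka-to-Delta-Lake streaming job this role involves, written in PySpark. It is a sketch only; the broker address, topic name, event schema, and storage paths are hypothetical placeholders, not details from this posting.

    # Minimal sketch: Kafka -> Delta Lake streaming ingest with PySpark.
    # Broker, topic, schema, and paths below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import (DoubleType, StringType, StructField,
                                   StructType, TimestampType)

    spark = SparkSession.builder.appName("payments-ingest").getOrCreate()

    # Assumed shape of the incoming JSON events.
    event_schema = StructType([
        StructField("contract_id", StringType()),
        StructField("amount", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    raw = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "payments")                   # placeholder topic
        .load()
    )

    # Kafka delivers the payload as bytes; decode and parse the JSON.
    events = (
        raw.select(F.from_json(F.col("value").cast("string"),
                               event_schema).alias("e"))
           .select("e.*")
    )

    # Append to a Delta table; the checkpoint makes restarts idempotent.
    (
        events.writeStream
        .format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/payments")  # placeholder
        .outputMode("append")
        .start("/mnt/delta/payments")                               # placeholder
        .awaitTermination()
    )

The checkpoint location is what lets Structured Streaming recover into Delta with exactly-once semantics after a failure, which speaks to the reliability requirements listed above.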
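Illustrative sketch (model lifecycle): the snippet below shows, in outline, how MLflow tracks and registers a model for reproducibility and deployment. The experiment name, registered model name, and the scikit-learn toy model are hypothetical stand-ins for illustration only.

    # Minimal sketch of the MLflow model lifecycle: train, log, register.
    # Experiment and model names are hypothetical placeholders.
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    mlflow.set_experiment("credit-adjudication-demo")  # placeholder experiment

    with mlflow.start_run():
        model = LogisticRegression(max_iter=200).fit(X_train, y_train)
        acc = accuracy_score(y_test, model.predict(X_test))

        # Parameters and metrics are logged so runs are reproducible.
        mlflow.log_param("max_iter", 200)
        mlflow.log_metric("accuracy", acc)

        # registered_model_name creates a new version in the model registry,
        # which downstream deployment pipelines can promote through stages.
        mlflow.sklearn.log_model(
            model,
            artifact_path="model",
            registered_model_name="credit-risk-model",  # placeholder name
        )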