Senior Data Engineer (Global Security) at RBC Global Asset Management
Toronto, Ontario, Canada - Full Time


Start Date

Immediate

Expiry Date

26 Feb, 2026

Salary

Not specified

Posted On

28 Nov, 2025

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Big Data Management, Cloud Computing, Database Development, Databricks Platform, Data Engineering, Data Mining, Data Pipelines, Data Warehousing, ETL Development, Microservice Framework, Python, Quality Management, Requirements Analysis

Description
What is the opportunity?

We are looking for a Senior Data Engineer to join our collaborative cyber security team, focusing on building scalable data solutions that directly enhance our application ecosystem. You will work closely with our Python application developers to integrate robust data capabilities into existing and new applications, transforming how our products leverage data for better design and business value. You'll be instrumental in developing a comprehensive data strategy that seamlessly bridges the gap between data and application functionality. This role is focused on Python application development with Databricks experience.

What will you do?

- Design, develop, and maintain end-to-end data pipelines in Azure Databricks using Spark (SQL, PySpark).
- Implement and optimize ETL/ELT workflows using Databricks Workflows or Apache Airflow, ensuring data integrity, quality, and reliability.
- Manage Delta Lake solutions for data versioning, incremental loads, and efficient application data access (a minimal sketch follows this section).
- Apply best practices in data governance, ensuring compliance using Unity Catalog for access management and data lineage tracking.
- Monitor, troubleshoot, and optimize Spark jobs for performance, addressing data pipeline bottlenecks that impact application responsiveness.
- Build automated monitoring, alerting, and incident management solutions to ensure data reliability, availability, and performance.
- Collaborate with our developers and cross-functional teams to integrate data capabilities into existing and new applications.
- Build APIs and data services that applications can consume for real-time and batch data processing.
- Develop and maintain comprehensive documentation for data pipelines, transformations, and data models.
- Foster knowledge sharing and cross-team collaboration.

What do you need to succeed?

- Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Engineer).
- Application development experience; proficiency in Python and building microservices.
- Experience working in an enterprise that embraces programming best practices.
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 5+ years of proven experience in data engineering, delivering business-critical software solutions for large enterprises with a consistent track record of success.
- Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables, Databricks Runtime, cluster management, etc.).
- Strong experience with Spark and PySpark for big data processing.
- Knowledge of SCM, Infrastructure-as-Code, and CI/CD pipelines.

Nice to have:

- Exposure to Kubernetes, Docker, and Terraform.
- Strong understanding of business intelligence and reporting tools.
- Familiarity with cyber security concepts.
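As a rough illustration of the Delta Lake incremental-load work described above, here is a minimal sketch. It assumes a Databricks job where a SparkSession named spark is already available; the source path, the target table cyber.security_events, and the event_id key column are hypothetical.

    # Minimal sketch: incremental upsert into a Delta table with MERGE.
    # Assumes a Databricks environment where `spark` already exists;
    # the source path, table name, and key column are hypothetical.
    from delta.tables import DeltaTable

    # Deduplicated batch of new records from a (hypothetical) raw landing zone.
    updates = (
        spark.read.format("json")
        .load("/mnt/raw/security_events/latest/")
        .dropDuplicates(["event_id"])
    )

    target = DeltaTable.forName(spark, "cyber.security_events")

    # Upsert: update rows that already exist, insert the rest.
    (
        target.alias("t")
        .merge(updates.alias("s"), "t.event_id = s.event_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )

MERGE keeps the load idempotent, so a retried job run does not duplicate events, and Delta's transaction log provides the data versioning the role calls for.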
What's in it for you?

We thrive on the challenge to be our best, progressive thinking to keep growing, and working together to deliver trusted advice to help our clients thrive and communities prosper. We care about each other, reaching our potential, making a difference to our communities, and achieving success that is mutual.

- A comprehensive Total Rewards Program including bonuses and flexible benefits, competitive compensation, commissions, and stock where applicable
- Leaders who support your development through coaching and managing opportunities
- Work in a dynamic, collaborative, progressive, and high-performing team
- A world-class training program in financial services
- Opportunities to do challenging work
- Opportunities to take on progressively greater accountabilities
- Opportunities to build close relationships with clients

#LI-Post #LI-PK #TECHPJ

Job Skills

Big Data Management, Cloud Computing, Database Development, Databricks Platform, Data Engineering, Data Mining, Data Pipelines, Data Warehousing (DW), ETL Development, ETL Processing, Group Problem Solving, Microservice Framework, Microsoft Azure Databricks, Python (Programming Language), Quality Management, Requirements Analysis

Additional Job Details

Address: 16 YORK ST:TORONTO
City: Toronto
Country: Canada
Work hours/week: 37.5
Employment Type: Full time
Platform: TECHNOLOGY AND OPERATIONS
Job Type: Regular
Pay Type: Salaried
Posted Date: 2025-10-03
Application Deadline: 2025-12-01

Note: Applications will be accepted until 11:59 PM on the day prior to the application deadline date above.

Inclusion and Equal Opportunity Employment

At RBC, we believe an inclusive workplace that has diverse perspectives is core to our continued growth as one of the largest and most successful banks in the world. Maintaining a workplace where our employees feel supported to perform at their best, effectively collaborate, drive innovation, and grow professionally helps to bring our Purpose to life and create value for our clients and communities. RBC strives to deliver this through policies and programs intended to foster a workplace based on respect, belonging and opportunity for all.

Join our Talent Community

Stay in the know about great career opportunities at RBC. Sign up and get customized info on our latest jobs, career tips and recruitment events that matter to you. Expand your limits and create a new future together at RBC. Find out how we use our passion and drive to enhance the well-being of our clients and communities at jobs.rbc.com.
Responsibilities
Design, develop, and maintain end-to-end data pipelines in Azure Databricks using Spark. Collaborate with developers and cross-functional teams to integrate data capabilities into existing and new applications.
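As a rough sketch of the "APIs and data services that applications can consume" described in the posting, the snippet below shows one way such a service might look, assuming FastAPI and the databricks-sql-connector package; the environment variable names, endpoint, and table/column names are hypothetical.

    # Minimal sketch of a data service exposing a Delta table to applications.
    # Assumes FastAPI and databricks-sql-connector are installed; the env var
    # names, warehouse details, and table/column names are hypothetical.
    import os

    from databricks import sql
    from fastapi import FastAPI

    app = FastAPI()

    def run_query(statement: str, params: dict | None = None) -> list[dict]:
        # Short-lived connection to a Databricks SQL warehouse.
        with sql.connect(
            server_hostname=os.environ["DATABRICKS_HOST"],
            http_path=os.environ["DATABRICKS_HTTP_PATH"],
            access_token=os.environ["DATABRICKS_TOKEN"],
        ) as conn:
            with conn.cursor() as cur:
                cur.execute(statement, params)
                cols = [c[0] for c in cur.description]
                return [dict(zip(cols, row)) for row in cur.fetchall()]

    @app.get("/events/{event_id}")
    def get_event(event_id: str) -> list[dict]:
        # Parameterized query keeps the service safe from SQL injection.
        return run_query(
            "SELECT * FROM cyber.security_events WHERE event_id = :id",
            {"id": event_id},
        )

Because the query runs through a SQL warehouse, Unity Catalog access controls and lineage tracking still apply to whatever the service reads.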