DataOps Engineer
at Questrade Financial Group
Toronto, ON, Canada
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
---|---|---|---|---|---|---|---|
Immediate | 20 Jan, 2025 | Not Specified | 21 Oct, 2024 | N/A | Fintech, Airflow, Google Cloud Platform, Kafka, Kubernetes, OpenShift | No | No |
Description:
Questrade Financial Group (QFG) of Companies is committed to helping our customers become much more financially successful and secure.
We are everything a traditional financial institution is not. At QFG, you will be constantly moving forward, bringing the future of fintech into existence. You will be a part of a collaborative team that cares deeply about our mission and each other. Your team members will help you conquer challenges, push boundaries and discover what you are truly capable of.
This is a place where you can explore, discover and learn with continuous growth. As a diverse and inclusive place to work, there are flexible working arrangements so you can unleash your creativity and curiosity with no limits. If you share the same sense of infinite possibility, come shape your future at Questrade.
Flexiti is a member of the Questrade Group of Companies (QFG), which currently includes Questrade Inc., QuestEnterprise, Questrade Wealth Management Inc., CTC, Think!nsure Ltd., and Zolo Ventures Ltd.
Welcome to Flexiti, where affordability meets growth!
We’re Canada’s top fintech lender, growing fast to make the lives of Canadians more affordable. Our flexible financing options help boost sales for our retail partners. Our team of passionate individuals from around the world believes in enjoying the journey and sparking creativity. If you want to be part of a community where your ideas matter, join us at Flexiti and let’s make waves together!
Learn more at www.flexiti.com
The perks
- Health & wellbeing resources and programs
- Paid vacation, personal, and sick days for work-life balance
- Competitive compensation and benefits packages
- Hybrid and flexible work arrangements
- Career growth and development opportunities
- Opportunities to contribute to community causes
- Work with diverse team members in an inclusive and collaborative environment
What’s it like working as a DataOps Engineer?
It’s an exciting time to join Data Engineering. We are building out new, modern Data and Analytics environments on Databricks and GCP to support key initiatives on the Questrade roadmap. This role focuses on working closely with Data Engineers in a foundational, enabling, and supporting capacity. The DataOps team is responsible end-to-end for managing our Databricks environment. In addition, Data Engineering is expanding the scope of DataOps to include MLOps. Through Questrade’s recent acquisitions, we now manage machine learning (ML) pipelines in both Databricks and GCP. Currently, the DataOps team is responsible for the ML pipelines in Databricks and will eventually take on the ML pipelines in GCP for the rest of the organization. As part of this team, you’ll help maintain and scale our Databricks platform using the latest technology, including Unity Catalog, while automating tasks, creating catalogs, and ensuring infrastructure aligns with our growing data engineering and machine learning needs.
Need more details? Keep reading…
- Coordinate and manage the deployment of workflows and pipelines sourced from Git.
- Support Data Engineers in developing and optimizing SQL Server-based pipelines, including stored procedures, advanced T-SQL, and general pipeline development across multiple sources like BigQuery, MongoDB Atlas, REST APIs, and various file types.
- Maintain and administer the Databricks platform, utilizing the latest technologies such as Unity Catalog, including creating catalogs, schemas, volumes, and secret scopes.
- Automate administrative tasks using the Databricks REST API (knowledge of Terraform is a strong plus).
- Collaborate with infrastructure teams to implement network configurations, bucket creation, and manage Azure AD user/group integrations via SCIM.
- Maintain the PowerBI cloud platform, ensuring integration and compatibility with data pipelines.
- Implement and maintain CI/CD pipelines, focusing on version control and automation best practices.
- Apply advanced MLOps knowledge, leveraging Databricks tools for model lifecycle management, deployment, and monitoring.
- Ensure monitoring and alerting with tools like Datadog (or similar systems).
- Work on implementing PCI DSS controls; experience with audits and compliance will be a plus.
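One of the responsibilities above is automating administrative tasks through the Databricks REST API. As a rough illustration only (this is not Questrade's actual tooling), the sketch below builds an authenticated request to the standard `/api/2.0/secrets/scopes/create` endpoint; the environment-variable names, the scope name, and the `main()` wiring are assumptions for the example.

```python
"""Hedged sketch: creating a Databricks secret scope via the REST API.

Assumptions (not from the posting): the workspace URL and a personal
access token are supplied via DATABRICKS_HOST / DATABRICKS_TOKEN.
"""
import json
import os
import urllib.request


def build_create_scope_request(host: str, token: str, scope_name: str) -> urllib.request.Request:
    """Build (but do not send) the POST request that creates a secret scope."""
    payload = {"scope": scope_name, "initial_manage_principal": "users"}
    return urllib.request.Request(
        url=f"{host}/api/2.0/secrets/scopes/create",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def main() -> None:
    # Only runs against a live workspace; both variables are hypothetical names.
    host = os.environ.get("DATABRICKS_HOST")    # e.g. https://<workspace>.cloud.databricks.com
    token = os.environ.get("DATABRICKS_TOKEN")  # a personal access token
    if host and token:
        req = build_create_scope_request(host, token, "etl-credentials")
        with urllib.request.urlopen(req) as resp:  # network call
            print(resp.status)


if __name__ == "__main__":
    main()
```

In practice this kind of automation is often expressed in Terraform instead (hence the "strong plus" above), which manages the same API objects declaratively.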
So are YOU our next DataOps Engineer? You are if you have…
- Proven experience in maintaining and automating Databricks using the REST API (Terraform knowledge is a plus) with a focus on the latest technologies and Unity Catalog.
- Strong proficiency in AWS or GCP.
- Advanced skills in Apache Spark and Delta Lake for data engineering tasks.
- Strong SQL Server expertise (temporal tables, PIVOTs, recursive queries, triggers, views, stored procedures, etc.).
- Experience with CI/CD workflows and source control (Git).
- Familiarity with MongoDB Atlas and REST API integrations.
- Familiarity with Unix-based systems (Ubuntu).
- Knowledge of MLOps, particularly in Databricks.
- Experience maintaining the PowerBI cloud platform.
- Ability to analyze data engineering requirements, design and troubleshoot ETL processes, optimize SQL statements for performance, identify data integrity issues, and resolve bottlenecks.
PREFERRED SKILLS:
- Experience with Terraform for infrastructure automation.
- Familiarity with Kubernetes, OpenShift, or other container platforms is a plus.
- Experience with message queuing systems like Kafka.
- Familiarity with monitoring tools like Datadog or similar systems.
- Experience in PCI DSS controls and audit/compliance processes.
- Proven experience with Google Cloud Platform (GCP), particularly in BigQuery.
- Experience with Airflow or Databricks for pipeline orchestration.
Sounds like you? Click below to apply!
At Questrade Financial Group of Companies, with multiple office locations around the world, we are committed to fostering a diverse, inclusive and accessible work environment. This is an environment where individuals are treated with dignity and respect. Here, the unique skills and experience you bring will be valued. You will be supported and motivated, so that you can harness your unlimited potential. Our team reflects the diversity of the communities we serve and operate in. Having a collaborative and diverse team helps us push boundaries to bring the future of fintech into existence—not only for the benefit of our customers, but for those who build their career with us.
Candidates selected for an interview will be contacted directly. If you require accommodation during the recruitment/selection process, please let us know and we will work with you to meet your needs.
Responsibilities:
Please refer to the job description for details.
REQUIREMENT SUMMARY
Min: N/A | Max: 5.0 year(s)
Information Technology/IT
IT Software - Application Programming / Maintenance
Software Engineering
Graduate
Proficient
1
Toronto, ON, Canada