Data Engineer
at Toyota
Toronto, ON M1H 1H9, Canada
Start Date: Immediate
Expiry Date: 10 Apr, 2025
Salary: Not Specified
Posted On: 21 Jan, 2025
Experience: N/A
Skills: Data Engineering, Programming Languages, Data Modeling, Query Optimization, Shell Scripting, Writing, Cloud Services, Communication Skills, Version Control, Python, Data Vault, Data Marts, Athena, Star Schema, Debugging, Analytical Skills, Data Warehousing
Telecommute: No
Sponsor Visa: No
Required Visa Status:
Citizen | US Citizen | Green Card (GC) | H1B | OPT | CPT | H4 (Spouse of H1B) | Student Visa
Employment Type:
Full Time | Part Time | Permanent | Independent (1099) | Contract (W2) | Contract (Corp-to-Corp) | Contract to Hire (W2) | Contract to Hire (Independent) | Contract to Hire (Corp-to-Corp)
Description:
Data Engineer – 12-month contract
About Toyota Financial Services
Toyota Financial Services (TFS) provides retail, leasing, and dealer financing services to Toyota and Lexus dealerships and customers across Canada. TFS is a member of Toyota Financial Services Corporation (TFSC), a wholly owned subsidiary of Toyota Motor Corporation in Japan, with its Canadian operations headquartered in Markham, Ontario.
What Sets Us Apart?
At Toyota Financial Services (TFS), you will help create best-in-class customer and dealer experiences in an innovative, collaborative, and team-focused environment. TFS is an important part of the Toyota family, an award-winning global company recognized worldwide for its technological leadership and superior standards of quality, continuous improvement, and environmental responsibility.
TFS currently has an exciting 12-month contract opportunity as a Data Engineer for its Enterprise Data Platform project. This role supports Enterprise Data Management (EDM) development for the project team. The ideal candidate will be responsible for designing, implementing, and maintaining data pipelines and data models using ETL processes, ensuring efficient data flows within our ecosystem. You will work with modern technologies and play a key role in enabling data-driven decision-making throughout the organization. The role is based in our Markham head office.
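For illustration only, the minimal sketch below shows the general shape of such an ETL step in Python; the CSV source, the dealer_id and amount fields, and the quality rule are assumptions made for the example, not details of TFS systems.

```python
# Illustrative only: a minimal extract-transform-load skeleton.
# Source format, field names, and the quality rule are assumptions.
import csv
import io
from datetime import datetime, timezone

def extract(raw_csv: str) -> list[dict]:
    """Read raw source records (here: an in-memory CSV) into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Apply simple cleansing and add a load timestamp for auditability."""
    loaded_at = datetime.now(timezone.utc).isoformat()
    return [
        {"dealer_id": r["dealer_id"].strip(),
         "amount": round(float(r["amount"]), 2),
         "loaded_at": loaded_at}
        for r in rows
        if r.get("dealer_id")  # drop records failing a basic quality check
    ]

def load(rows: list[dict]) -> None:
    """Stand-in for a write to the target platform (e.g. a staging table)."""
    for row in rows:
        print(row)

if __name__ == "__main__":
    sample = "dealer_id,amount\nD001,1250.5\nD002,980.757\n"
    load(transform(extract(sample)))
```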
What You’ll Be Doing:
You will work closely with the Enterprise Data Platform project team to:
- Develop and maintain ETL pipelines to integrate data from various sources into the Enterprise Data Platform.
- Develop and optimize data models using Data Vault 2.0 and Star Schema methodologies across different data hubs.
- Work with the Snowflake database to build the Enterprise Data Platform, including the corporate Data Lake, ODS, and Data Marts (a minimal hub-load sketch follows this list).
- Write and optimize complex SQL queries for data extraction, manipulation, and reporting, ensuring optimal performance and efficiency.
- Support data governance initiatives by ensuring data quality, consistency, and security in collaboration with the Data Governance team, business teams, and other stakeholders.
- Document and maintain data lineage to support transparency in data sources, transformations, and movements, working with governance teams to implement best practices.
- Leverage AWS Cloud technologies to build and maintain cloud-based data solutions, utilizing services such as S3, RDS, Redshift, Athena, DynamoDB, and more as needed.
- Work within Azure DevOps for CI/CD pipelines, code versioning, and automation, ensuring efficient and continuous integration and deployment processes.
- Collaborate with stakeholders across the organization to gather data requirements, provide insights, and support analytical needs.
- Ensure data quality and accuracy by implementing best practices for data governance, validation, and audit control.
- Monitor and troubleshoot data pipelines to maintain reliability and uptime.
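To illustrate the Data Vault and Snowflake responsibilities above, here is the hub-load sketch referenced in the list. It assumes the snowflake-connector-python package; the HUB_CUSTOMER and STG_CUSTOMER tables, the connection settings, and the use of MD5 hash keys are illustrative assumptions, not details of the actual Enterprise Data Platform.

```python
# A minimal sketch, assuming snowflake-connector-python and a Data Vault
# style hub; table, column, and connection names are illustrative only.
import os
import snowflake.connector

HUB_DDL = """
CREATE TABLE IF NOT EXISTS HUB_CUSTOMER (
    CUSTOMER_HK   VARCHAR(32)   NOT NULL,  -- hash of the business key
    CUSTOMER_BK   VARCHAR(100)  NOT NULL,  -- source business key
    LOAD_DTS      TIMESTAMP_NTZ NOT NULL,
    RECORD_SOURCE VARCHAR(50)   NOT NULL,
    PRIMARY KEY (CUSTOMER_HK)
)
"""

# Insert only business keys not yet present in the hub (insert-only pattern).
HUB_LOAD = """
MERGE INTO HUB_CUSTOMER h
USING (
    SELECT DISTINCT
           MD5(CUSTOMER_ID)    AS CUSTOMER_HK,
           CUSTOMER_ID         AS CUSTOMER_BK,
           CURRENT_TIMESTAMP() AS LOAD_DTS,
           'CRM_STAGE'         AS RECORD_SOURCE
    FROM STG_CUSTOMER
) s
ON h.CUSTOMER_HK = s.CUSTOMER_HK
WHEN NOT MATCHED THEN
    INSERT (CUSTOMER_HK, CUSTOMER_BK, LOAD_DTS, RECORD_SOURCE)
    VALUES (s.CUSTOMER_HK, s.CUSTOMER_BK, s.LOAD_DTS, s.RECORD_SOURCE)
"""

def load_customer_hub() -> None:
    """Create the hub if needed, then merge in new business keys from staging."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="EDP",
        schema="RAW_VAULT",
    )
    try:
        cur = conn.cursor()
        cur.execute(HUB_DDL)
        cur.execute(HUB_LOAD)
    finally:
        conn.close()

if __name__ == "__main__":
    load_customer_hub()
```

The insert-only MERGE mirrors how Data Vault 2.0 hubs are typically loaded: one row per business key, with satellites (not shown) carrying descriptive history.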
What You’ll Bring:
- 5 years of hands-on experience in data engineering, ETL development, or related roles.
- Proficient in ETL processes and tools, with experience in designing, developing, and managing scalable data pipelines.
- Strong knowledge of Data Vault and Star Schema data modeling techniques for building Enterprise Data Platform and optimizing data marts.
- Experience with Snowflake Database for data warehousing, data modeling, and query optimization.
- Advanced SQL skills, capable of writing, debugging, and optimizing complex queries.
- Knowledge of the vi editor in a Unix environment.
- Hands-on experience with AWS Cloud services, including (but not limited to) S3, RDS, Redshift, Athena, and DynamoDB (a brief Athena sketch follows this list).
- Experience with Azure DevOps tools and processes including CI/CD pipelines, version control (e.g., Git), and deployment automation.
- Understanding of data governance principles and willingness to support data governance frameworks, data quality initiatives, and data security standards.
- Experience with data lineage documentation and understanding the flow of data through various processes.
- Familiarity with modern data integration tools (e.g., Informatica) is a plus.
- Experience with Python, Shell scripting, or other programming languages is advantageous.
- Strong problem-solving and analytical skills, with the ability to diagnose and troubleshoot data-related issues.
- Excellent communication skills, both written and verbal, with the ability to work collaboratively in a team environment.
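As a small, hedged illustration of the AWS skills above (the Athena sketch referenced in the list), the Python snippet below runs a query through Athena over S3-backed data using boto3; the edp_lake database, payments table, and results bucket are placeholders, not real TFS resources.

```python
# A hedged sketch of querying S3 data through Athena with boto3;
# database, table, and bucket names below are placeholders.
import time
import boto3

def run_athena_query(sql: str, database: str, output_s3: str) -> list[list[str]]:
    """Start an Athena query, poll until it finishes, and return raw rows."""
    athena = boto3.client("athena")
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    query_id = execution["QueryExecutionId"]

    # Poll for completion (simple loop; production code would add timeouts).
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")

    results = athena.get_query_results(QueryExecutionId=query_id)
    return [[col.get("VarCharValue", "") for col in row["Data"]]
            for row in results["ResultSet"]["Rows"]]

if __name__ == "__main__":
    rows = run_athena_query(
        "SELECT dealer_id, SUM(amount) AS total FROM payments GROUP BY dealer_id",
        database="edp_lake",                       # placeholder database
        output_s3="s3://example-athena-results/",  # placeholder bucket
    )
    for row in rows:
        print(row)
```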
What You Should Know:
Our success begins and ends with our people. We embrace diverse perspectives and value unique human experiences. We are proud to be an equal opportunity employer that celebrates the diversity of the communities where we live and do business. Applicants for our positions are considered without regard to race, ethnicity, national origin, sex, sexual orientation, gender identity or expression, age, disability, religion, or any other characteristics protected by law.
Responsibilities:
Please refer to the job description above for details.
REQUIREMENT SUMMARY
Min: N/A | Max: 5.0 year(s)
Information Technology/IT
IT Software - DBA / Datawarehousing
Software Engineering
Graduate
Proficient
1
Toronto, ON M1H 1H9, Canada