Senior Data Integration Developer
at KPMG
Toronto, ON, Canada
| Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
|---|---|---|---|---|---|---|---|
| Immediate | 26 Apr, 2025 | Not Specified | 26 Jan, 2025 | 7 year(s) or above | Good communication skills | No | No |
Description:
Overview:
At KPMG, you’ll join a team of diverse and dedicated problem solvers, connected by a common cause: turning insight into opportunity for clients and communities around the world.
The role of a Senior Data Integration Developer (Operation Team) is to deliver services and solutions for KPMG Canada internal clients through the Enterprise Analytics CoE using the suite of Microsoft Azure Data Services and data integrations.
What you will do:
- Develop and implement data integration solutions using Azure services like Azure Data Factory, Logic Apps, and Azure Synapse Analytics
- Migrate on-premises data to Azure cloud environments, ensuring data integrity and consistency
- Design, develop, and maintain ETL (Extract, Transform, Load) processes to extract data from various sources, transform it according to business requirements, and load it into data warehouses or data lakes (a minimal ETL sketch follows this list)
- Build and manage data pipelines, ensuring their smooth operation, troubleshooting issues, and optimizing performance
- Create and maintain data models for structured and unstructured data, ensuring data is accurately represented and accessible
- Work closely with business analysts, data engineers, and other stakeholders to understand data requirements and deliver appropriate solutions
- Ensure that all data integration processes comply with organizational security policies and regulatory requirements
- Automate data integration tasks to improve efficiency and reduce manual intervention
- Document data integration processes, data flow diagrams, and architecture for future reference and maintenance
- Continuously learn and apply new features and best practices for Azure data integration services
- Identify and resolve issues within data integration processes, ensuring high availability and reliability of data flows
- Optimize data integration workflows for better performance and scalability
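To illustrate the ETL responsibility above, here is a minimal Python sketch of a single extract-transform-load step from an on-premises SQL Server into an Azure SQL Database staging table. It is illustrative only: the connection strings, table names (dbo.Orders, dwh.stg_daily_revenue), and the daily-revenue business rule are hypothetical placeholders, and in practice a step like this would usually run inside an Azure Data Factory or Synapse pipeline rather than as a standalone script.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection strings -- replace with real servers and credentials.
source_engine = create_engine(
    "mssql+pyodbc://etl_user:secret@onprem-sql/SalesDB?driver=ODBC+Driver+18+for+SQL+Server"
)
target_engine = create_engine(
    "mssql+pyodbc://etl_user:secret@azuresql-server.database.windows.net/DWH?driver=ODBC+Driver+18+for+SQL+Server"
)

# Extract: pull raw orders from the source system
orders = pd.read_sql(
    "SELECT order_id, customer_id, amount, order_date FROM dbo.Orders",
    source_engine,
)

# Transform: an example business rule -- daily revenue per customer
daily_revenue = (
    orders.assign(order_date=pd.to_datetime(orders["order_date"]).dt.date)
    .groupby(["customer_id", "order_date"], as_index=False)["amount"]
    .sum()
    .rename(columns={"amount": "daily_revenue"})
)

# Load: append into a hypothetical staging table in the warehouse
daily_revenue.to_sql(
    "stg_daily_revenue", target_engine, schema="dwh", if_exists="append", index=False
)
```

Keeping extract, transform, and load as separate, clearly commented stages makes the job easier to troubleshoot and to migrate into managed pipeline activities later.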
What you bring to this role:
- At least 7 years of experience in designing, developing, and managing data pipelines within Azure Data Factory
- Familiarity with CI/CD pipelines, version control, and deployment practices using Azure DevOps
- Skills in monitoring data integration pipelines and optimizing performance to handle large datasets efficiently
- Experience with ETL tools and processes, including data extraction, transformation, and loading into data warehouses or lakes
- Understanding of data security practices, including encryption, role-based access control, and compliance with regulatory requirements
- Experience in managing a team of developers and conducting code reviews
- Ability to create and manage workflows using Azure Logic Apps and develop serverless solutions using Azure Functions
- Ability to troubleshoot and resolve issues in data integration processes efficiently
- Experience with data visualization and analytics tools like Power BI or Azure Analysis Services
- Proficiency in scripting languages like Python, PowerShell, or Scala for automation and data processing tasks (see the automation sketch after this list)
- Experience with integrating data from APIs and other web services into Azure-based solutions
- Understanding of Azure storage options like Blob Storage, Data Lake Storage, and Azure Files
- Strong knowledge of SQL, including complex queries, stored procedures, and database management with platforms like Azure SQL Database and SQL Server
- Current Microsoft Azure certifications are a plus
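To ground the Azure Data Factory and Python-scripting requirements above, the sketch below shows one way an operations-focused developer might trigger and monitor an ADF pipeline run using the azure-identity and azure-mgmt-datafactory packages. The subscription ID, resource group, factory, and pipeline names are placeholders, and error handling is kept minimal; treat this as a sketch under those assumptions rather than a production script.

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers -- substitute your own environment values.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-analytics-coe"
FACTORY_NAME = "adf-enterprise-analytics"
PIPELINE_NAME = "pl_copy_sales_to_lake"

# Authenticate with whatever credential is available (CLI login, managed identity, etc.)
credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# Trigger the pipeline, then poll until the run reaches a terminal state
run = adf_client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={}
)
while True:
    status = adf_client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
    if status.status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline run {run.run_id} finished with status: {status.status}")
```

Polling the run status this way mirrors what the ADF monitoring view reports, and the same pattern is commonly reused for smoke tests inside Azure DevOps release pipelines.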
REQUIREMENT SUMMARY
- Experience: 7.0 to 12.0 year(s)
- Industry: Information Technology/IT
- Category: IT Software - DBA / Datawarehousing
- Role: Software Engineering
- Education: Graduate
- Proficiency: Proficient
- Openings: 1
- Location: Toronto, ON, Canada