Data Integration Developer - (Hybrid - Toronto) at Capco
Toronto, ON, Canada - Full Time


Start Date

Immediate

Expiry Date

10 Oct, 25

Salary

0.0

Posted On

10 Jul, 25

Experience

8 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

LET’S GET DOWN TO BUSINESS

Capco is looking for talented, innovative and creative people to join our incredible and growing team focused on our financial services clients. We are looking for experienced talent with exceptional domain expertise who can work directly with our clients on mission-critical projects.

Responsibilities

Job Summary:
We are looking for a highly skilled Senior Developer with strong experience in data integration, JSON-based data mapping, Java Spring Boot, and ETL development. The ideal candidate will be responsible for building and maintaining scalable data transformation pipelines, translating complex data models across systems, and integrating them with enterprise-grade applications and APIs.

Key Responsibilities:
Analyze business and technical data requirements to design and implement data mapping logic in JSON format.
Develop and maintain backend services using Java 8+/11 and Spring Boot frameworks.
Design and implement robust and reusable ETL pipelines to support batch and real-time data flows.
Perform data model analysis, transformation, and normalization across relational, semi-structured, and nested data formats.
Develop and maintain documentation for data mapping logic and transformation flows.
Collaborate with data architects, analysts, and QA teams to ensure integrity and quality of data.
Work on integrating APIs, data connectors, and third-party platforms using standard protocols (REST, SOAP, JDBC, etc.).
Debug, optimize, and enhance performance of existing data flows and backend logic.
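To give a flavour of the data mapping work described above, here is a minimal sketch of flattening a nested source record into a flat target schema. The field names (`customerId`, `fullName`, `city`) and the record shape are illustrative assumptions, not a real client schema, and the parsed JSON is represented with plain `Map`s to keep the sketch dependency-free; a production service would typically parse and emit JSON with a library such as Jackson inside a Spring Boot component.

```java
import java.util.HashMap;
import java.util.Map;

public class CustomerMapper {

    // Flatten a nested source record (parsed JSON represented as Maps)
    // into a flat target schema. Field names here are hypothetical.
    @SuppressWarnings("unchecked")
    public static Map<String, Object> map(Map<String, Object> src) {
        // Missing intermediate objects default to empty maps so the
        // mapping degrades gracefully instead of throwing.
        Map<String, Object> profile =
                (Map<String, Object>) src.getOrDefault("profile", new HashMap<>());
        Map<String, Object> address =
                (Map<String, Object>) profile.getOrDefault("address", new HashMap<>());

        Map<String, Object> out = new HashMap<>();
        out.put("customerId", src.get("id"));
        out.put("fullName", profile.get("firstName") + " " + profile.get("lastName"));
        out.put("city", address.getOrDefault("city", ""));
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> src = new HashMap<>();
        src.put("id", "42");
        Map<String, Object> profile = new HashMap<>();
        profile.put("firstName", "Ada");
        profile.put("lastName", "Lovelace");
        Map<String, Object> address = new HashMap<>();
        address.put("city", "Toronto");
        profile.put("address", address);
        src.put("profile", profile);

        System.out.println(CustomerMapper.map(src));
    }
}
```

In practice the source-to-target field correspondences would themselves be driven by a JSON mapping specification rather than hard-coded, so the same pipeline can serve multiple schemas.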
Required Skills & Experience:
8+ years of hands-on experience in Java development, with deep understanding of the Spring Boot ecosystem.
Strong proficiency in working with JSON, XML, and other structured formats for data exchange.
Expertise in data mapping, data transformation, and schema alignment.
Proven experience in ETL tools and frameworks (e.g., custom Java-based ETL).
Hands-on with REST API design, integration, and documentation using Swagger/OpenAPI.
Familiarity with data modeling, schema conversion, and entity relationships across systems.
Version control using Git, CI/CD workflows, and familiarity with Agile/Scrum methodologies.
Nice to Have:
Experience with cloud-native platforms (AWS).
Knowledge of message brokers (e.g., Kafka).
Exposure to data governance, metadata management, or data quality tools.
Experience with data cataloging and lineage tools.
