Senior Data Engineer - Toronto

at Exadel Inc

Toronto, ON, Canada

Start Date: Immediate
Expiry Date: 20 Dec, 2024
Salary: Not Specified
Posted On: 25 Sep, 2024
Experience: N/A
Skills: PostgreSQL, AWS, Computer Science, Containerization, SQL, GitHub, Data Processing, SQL Server, Maven, Oracle, Kafka, Communication Skills, Docker
Telecommute: No
Sponsor Visa: No

Description:

WHO WE ARE AT EXADEL:

Exadel is a global software consulting and development company that partners with organizations to help them become digital leaders in their industries. We look beyond the code to understand the impact our clients want to make and help them get from ideation to development and outcomes. We accelerate the results of digital transformations through an open, collaborative approach combined with our deep experience across industries, business processes, and technologies.
Exadel Financial Services is the financial arm of our organization, specializing in banking, capital markets strategy and technology consulting. Exadel is committed to service excellence and being a great place to work.
Location: Toronto

JOB SUMMARY:

We are looking for a highly experienced Senior AWS Developer to design and implement complex data applications using AWS services, the Spring framework, Apache Camel, Kafka, and other data products. The ideal candidate has hands-on experience developing enterprise-level software systems in AWS and excellent problem-solving skills. The team is looking to enhance our Data Mesh platform to onboard more data and make it more efficient. A framework is already in place; this person will focus on back-end development, expanding and enhancing the framework to establish more data feeds from internal data sources, Salesforce, and various vendors.

LEGAL DISCLAIMERS:

  • Exadel is an Equal Opportunity Employer – Minority / Women / Disability / Veteran / Gender Identity / Sexual Orientation / Age.
  • Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
  • Please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the employee for this job. Duties, responsibilities, and activities may change at any time with or without notice.

Qualifications:

  • Bachelor’s degree in Computer Science or Software Engineering, or equivalent relevant work experience.
  • Experience developing solutions in AWS.
  • Experience with containerization using Docker (e.g., OpenShift, AWS ECS Fargate).
  • Extensive experience developing complex applications with Java, the Spring framework, and Apache Camel/Spring Batch.
  • Experience implementing real-time data processing with Kafka and AWS SNS/SQS.
  • Strong SQL skills, including database schema design, stored procedures, and query writing. Experience with relational databases (e.g., SQL Server, Oracle, PostgreSQL).
  • Proficiency with DevOps/CI/CD tools such as GitHub, Maven, Azure DevOps, and Ansible.
  • Strong analytical and problem-solving skills.
  • Excellent written and verbal communication skills.
  • Familiarity with data catalog and lineage products (e.g., Informatica, Manta).
  • Familiarity with data virtualization products (e.g., Dremio, Denodo).

Responsibilities:

  • Design and develop complex Java-based applications using AWS services, the Spring framework, Apache Camel, Kafka, and SQL.
  • Work collaboratively with cross-functional teams to identify and resolve complex software issues.
  • Write clean, well-designed, and maintainable code.
  • Develop and maintain efficient and secure database schemas, stored procedures, and SQL queries.
  • Implement real-time data processing using Kafka, AWS SNS/SQS.
  • Develop and maintain data transformation logic using Apache Camel.
  • Participate in code review and ensure code quality, performance, and security standards are met.
  • Stay up to date with the latest industry trends, technologies, and best practices related to Java development and related frameworks.

Nice to Have:

  • Salesforce integration.
  • AMPS; in-memory cache/data platforms (e.g., MemSQL, Ignite).
  • Python.
  • Dashboard tools (e.g., Power BI, Elastic/ELK stack, Grafana).
  • Other advanced AWS services/components (e.g., SageMaker, Bedrock, Redshift).


REQUIREMENT SUMMARY

Min: N/A | Max: 5.0 year(s)

Information Technology/IT

IT Software - System Programming

Software Engineering

Graduate

Computer Science, Software Engineering, or relevant work experience

Proficient

1

Toronto, ON, Canada