Data Engineer

at Tarkett

Montréal, QC, Canada

Start Date: Immediate
Expiry Date: 30 Dec, 2024
Salary: Not Specified
Posted On: 03 Oct, 2024
Experience: N/A
Skills: Data Warehousing, Boomi, AWS, Modeling, Agile Methodologies, Jira, Data Transformation, Pipeline Development, Soft Skills, JSON, SQL, XML, Visio, dbt, Azure, Data Integration, Management Software, Snowflake
Telecommute: No
Sponsor Visa: No

Job Description:
We are seeking a skilled Data Engineer with deep experience in data integration platforms and cloud technologies. The ideal candidate will be responsible for building, managing, and optimizing data pipelines, ensuring smooth data integration across systems, and delivering high-quality data transformations. Expertise in Boomi, IICS, Snowflake, and dbt is highly desirable for this role. You will collaborate with cross-functional teams to ensure seamless data flow and reporting accuracy.

Key Responsibilities:

  • Design and Develop Data Pipelines: Build and manage scalable and cost-efficient data pipelines to integrate data from various internal and external sources into the cloud data warehouse (Snowflake) using Boomi, IICS, and dbt.
  • Data Integration: Leverage Boomi and Informatica to design and maintain integrations between cloud and on-premise systems, ensuring data quality and integrity.
  • Data Transformation: Implement data transformations using dbt and other ETL processes to prepare data for analysis, reporting, and machine learning purposes.
  • Optimize Data Workflows: Collaborate with Data Analysts, Data Scientists, and business stakeholders to optimize data workflows and improve overall efficiency in data handling.
  • Data Governance & Quality: Ensure high standards of data quality, governance, and security throughout the lifecycle of data.
  • Troubleshooting and Maintenance: Actively monitor, troubleshoot, and resolve data pipeline issues in real-time to minimize disruptions.
  • Documentation: Create and maintain comprehensive documentation for all data integration and pipeline processes.
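To make the transformation responsibility concrete, here is a minimal, purely illustrative sketch of the kind of "staging" step a dbt model typically expresses in SQL, written in plain Python. The record layout and field names (`order_id`, `amount_cents`, and so on) are hypothetical and are not taken from the employer's actual pipeline.

```python
# Illustrative only: normalize raw order records for warehouse loading by
# casting types, renaming fields, and dropping rows missing a primary key.
# All field names are hypothetical examples.
from datetime import date

def stage_orders(raw_rows):
    """Return cleaned rows ready to load into a warehouse staging table."""
    staged = []
    for row in raw_rows:
        if row.get("id") is None:  # enforce a primary-key constraint
            continue
        staged.append({
            "order_id": int(row["id"]),
            "amount": row["amount_cents"] / 100.0,  # cents -> dollars
            "order_date": date.fromisoformat(row["created"]),
        })
    return staged

raw = [
    {"id": "1", "amount_cents": 1250, "created": "2024-10-03"},
    {"id": None, "amount_cents": 99, "created": "2024-10-04"},  # dropped
]
print(stage_orders(raw))
```

In a dbt project, the same logic would live in a model's SELECT statement with tests (e.g., `not_null`, `unique`) declared on the staged columns.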

Required Qualifications:

Experience:

  • 4+ years of experience as a Data Engineer, with a focus on cloud data integration and pipeline development.
  • Strong experience in Boomi (or Informatica IICS), with hands-on experience building integrations and workflows.
  • Hands-on experience working with APIs.
  • Proficiency in Snowflake for cloud-based data warehousing.
  • Strong expertise in using dbt for data transformation and modeling.

Certifications:

At least one certification in either Boomi or Informatica is highly preferred; applicable certifications are listed below.

  • Informatica IICS:
      • Cloud Data Integration Developer
      • Cloud Application Integration Developer
  • Boomi Platform:
      • Boomi Professional Developer
      • Boomi Integration Developer

Technical Skills:

  • Proficiency in SQL for querying and transforming data.
  • Strong understanding of ETL/ELT concepts and tools.
  • Familiarity with REST APIs, JSON, XML, and other data exchange formats.
  • Experience working with version control systems (e.g., Git).
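Since the role calls for familiarity with REST-style data exchange formats, here is a small, illustrative sketch of parsing the same record from JSON and XML payloads into a common Python dict using only the standard library. The payload shape and field names (`sku`, `qty`) are hypothetical examples, not a specification from this posting.

```python
# Illustrative only: parse equivalent JSON and XML payloads into one
# normalized dict. Field names are hypothetical.
import json
import xml.etree.ElementTree as ET

def from_json(payload: str) -> dict:
    obj = json.loads(payload)
    return {"sku": obj["sku"], "qty": int(obj["qty"])}

def from_xml(payload: str) -> dict:
    root = ET.fromstring(payload)
    return {"sku": root.findtext("sku"), "qty": int(root.findtext("qty"))}

json_payload = '{"sku": "FLR-100", "qty": 4}'
xml_payload = "<item><sku>FLR-100</sku><qty>4</qty></item>"

# Both formats should normalize to the same record.
assert from_json(json_payload) == from_xml(xml_payload)
```

Normalizing inbound payloads to one internal shape, regardless of wire format, is a common pattern in integration platforms such as Boomi and IICS.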

Soft Skills:

  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills with both technical and non-technical stakeholders.
  • Ability to actively listen to and capture requirements from stakeholders.
  • Ability to work in an agile, fast-paced environment, with awareness of agile methodologies.

Preferred Qualifications and Knowledge:

  • Familiarity with other cloud platforms such as Azure or AWS.
  • Experience with CI/CD pipelines for deploying and managing data workflows.
  • Knowledge of data governance frameworks and best practices.
  • Knowledge of and proficiency with visualization and diagramming tools such as Visio, Lucid, Chart.io, and Miro.
  • Proficiency with task management software such as Jira is a plus.
  • Knowledge of Data Architectures and best practices for building out data pipelines.

What We Offer

  • A commitment that Safety is #1
  • Competitive benefits, pay, and retirement plan options!
  • Career growth, stability, and flexible work arrangements.

Responsible Manufacturing – Protecting Our Planet for the Future

  • We utilize renewable energy and a closed loop recycled water process.
  • We are committed to reducing greenhouse emissions and water consumption.
  • We are the only flooring company recognized by the Asthma and Allergy Foundation.

Who we are:
With a history of 140 years, Tarkett is a worldwide leader in innovative flooring and sport surface solutions, with 12,000 employees and 34 industrial sites. Its product range includes vinyl, linoleum, rubber, carpet, wood, laminate, artificial turf, and athletic tracks, and the Group serves customers in more than 100 countries across the globe.
Committed to changing the game through the circular economy and to reducing its carbon footprint, the Group has implemented an eco-innovation strategy based on Cradle to Cradle® principles, fully aligned with its Tarkett Human-Conscious Design® approach.
Tarkett is listed on Euronext (Compartment B, ISIN FR0004188670, ticker: TKTT).
www.tarkett-group.com
Tarkett is an equal opportunity employer. We value diversity in backgrounds and experiences and promote an inclusive workplace where all employees can perform at their best.



REQUIREMENT SUMMARY

Experience: Min N/A, Max 5.0 year(s)
Industry: Information Technology/IT
Specialization: IT Software - DBA / Datawarehousing
Function: Software Engineering
Education: Graduate
Proficiency: Proficient
Openings: 1
Location: Montréal, QC, Canada