ETL Software Developer

at Cosmetic Physician Partners

Montréal, QC, Canada

Start Date: Immediate
Expiry Date: 25 Jan, 2025
Salary: Not Specified
Posted On: 25 Oct, 2024
Experience: N/A
Skills: Scripting Languages, Talend, AWS, Google Cloud, Computer Science, Information Technology, Relational Databases, PostgreSQL, Integration, Snowflake, Data Engineering, JSON, Analytical Skills, Data Modeling, ETL Tools, Docker, JavaScript, Python, Azure, Git
Telecommute: No
Sponsor Visa: No

Description:

Are you passionate about building cutting-edge solutions and want to be part of something big from the ground up?
We’re Tovanah Analytics, a data analytics SaaS provider, transforming the way healthcare organizations use data with our innovative platform. Our mission is to empower clinics with advanced analytics that drive better decision-making.

JOB SUMMARY

We are seeking an experienced ETL Software Developer to join our data engineering team.
As one of our first software developer hires, you’ll play a pivotal role in shaping our technology and building the foundation for scalable, impactful products. This is an exciting opportunity to work directly with the founding team, make critical technical decisions, and leave your mark on a product that will change the healthcare landscape.
Reporting to our CTO, the successful candidate will design, develop, and maintain efficient ETL (Extract, Transform, Load) pipelines to support our data infrastructure. They will work closely with Tovanah’s founders to ensure data is accurately extracted from various sources, loaded into our data warehouse, and transformed for analytical and reporting purposes. This role requires a strong understanding of database design and the ability to work with large datasets in a fast-paced environment.
As a key player in our data ecosystem, the ETL Developer will optimize data workflows, troubleshoot issues, and ensure the overall reliability and scalability of our data pipelines.
If you’re ready to take on a challenge and create something truly meaningful, we’d love to hear from you!

QUALIFICATIONS:

  • Education: Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field.
  • Experience: 3+ years in ETL development, data engineering, or similar roles.
  • Hands-on experience with ETL tools such as dbt (Data Build Tool), Airflow, Talend, or Informatica.
  • Proficiency in working with cloud-based data warehouses (e.g., Snowflake, Redshift, BigQuery).
  • Strong SQL skills and experience with relational databases (PostgreSQL).
  • Experience with scripting languages like JavaScript or Python.
  • Experience integrating with APIs and working with JSON or CSV formats.
  • Familiarity with version control systems like Git.

SKILLS:

  • Solid understanding of data modeling, database architecture, and data warehousing concepts.
  • Strong analytical skills with the ability to troubleshoot and resolve data-related issues.
  • Knowledge of cloud platforms such as Google Cloud, AWS, or Azure.
  • Ability to manage multiple projects and prioritize tasks in a fast-paced environment.
  • Excellent communication and collaboration skills.

PREFERRED QUALIFICATIONS:

  • Experience with HTTP REST API development.
  • Familiarity with containerization and deployment tools like Docker.
  • Knowledge of machine learning workflows and integration with data pipelines.

Work Environment: Hybrid preferred

Responsibilities:

  • Design and Build ETL Pipelines: Develop scalable and efficient ETL pipelines to extract data from various sources, load it into the data warehouse, and transform it into usable formats for analysis.
  • Develop Data Integrations: Build integrations to authenticate, extract, and download data from multiple sources, including APIs, databases, and third-party platforms.
  • Optimize Data Workflows: Continuously monitor, tune, and optimize data flows to improve performance and reduce resource consumption.
  • Data Quality and Governance: Ensure data accuracy, integrity, and consistency across different stages of the pipeline. Implement and maintain data quality checks.
  • Documentation: Write comprehensive documentation for ETL processes, including pipeline design, data models, and troubleshooting guides.
  • Troubleshooting and Debugging: Diagnose and resolve issues in production pipelines in a timely manner to minimize downtime and ensure data availability.
  • Automation: Automate repetitive tasks and data workflows using scripting or scheduling tools.
  • Collaboration: Work closely with data engineers, architects, and business stakeholders to understand data requirements and ensure the ETL processes meet business needs.
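For candidates unfamiliar with the extract → transform → load pattern the responsibilities above describe, a minimal Python sketch follows. All names, the sample payload, and the in-memory "warehouse" list are illustrative assumptions, not part of this role's actual stack:

```python
import json

# Hypothetical raw payload, standing in for a JSON response from a source API.
RAW_PAYLOAD = json.dumps([
    {"patient_id": "p1", "visit_date": "2024-10-01", "revenue": "150.00"},
    {"patient_id": "p2", "visit_date": "2024-10-02", "revenue": "200.50"},
    {"patient_id": "p1", "visit_date": None, "revenue": "90.00"},  # dirty row
])

def extract(payload: str) -> list:
    """Extract: parse the raw JSON payload into Python records."""
    return json.loads(payload)

def transform(records: list) -> list:
    """Transform: apply a simple data-quality check and cast types."""
    clean = []
    for r in records:
        if not r.get("visit_date"):  # drop rows failing the quality rule
            continue
        clean.append({**r, "revenue": float(r["revenue"])})
    return clean

def load(rows: list, warehouse: list) -> int:
    """Load: append to an in-memory stand-in for a warehouse table."""
    warehouse.extend(rows)
    return len(rows)

warehouse_table = []
loaded = load(transform(extract(RAW_PAYLOAD)), warehouse_table)
```

In a production pipeline, the warehouse stand-in would be replaced by a real target (e.g., Snowflake), and the quality rule would be one of many checks enforced across pipeline stages.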


REQUIREMENT SUMMARY

  • Experience: Min: N/A; Max: 5.0 year(s)
  • Industry: Information Technology/IT — IT Software - DBA / Datawarehousing — Software Engineering
  • Education: Graduate in Computer Science, Data Engineering, Information Technology, or a related field
  • Proficiency: Proficient
  • Openings: 1
  • Location: Montréal, QC, Canada