Lead Data Engineer (Snowflake, DBT, Pandas) - R01554246 at Brillio
St. Louis, Missouri, USA
Full Time


Start Date

Immediate

Expiry Date

26 Nov, 25

Salary

80.0

Posted On

26 Aug, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Snowflake, Process Consulting, Data Quality, IT, Payer, Python, DBT, Communication Skills, Data Exchange, Validation, Batch Processing, Salesforce, Documentation, Data Models, Care Plans, Pandas, HIPAA, Design, Provider Networks, Data Engineering, PII, Eligibility, Analytics

Industry

Information Technology/IT

Description

ABOUT BRILLIO:

Brillio is one of the fastest-growing digital technology service providers and a partner of choice for many Fortune 1000 companies seeking to turn disruption into competitive advantage through innovative digital adoption. Brillio's professionals, known as "Brillians", combine cutting-edge digital and design-thinking skills with an unwavering dedication to client satisfaction.
Brillio takes pride in its status as an employer of choice, attracting exceptional talent through its emphasis on contemporary, groundbreaking technologies and exclusive digital projects. Its commitment to providing an exceptional experience to its Brillians and nurturing their full potential has earned it the Great Place to Work® certification year after year.

CONSULTANT - PRIMARY SKILLS

  • Snowflake, DBT, Pandas; Salesforce Health Cloud (Health Cloud Setup, Care Program / Patient Services, Care Management, Providers)

JOB REQUIREMENTS

Job Title: Senior Data Engineer - Snowflake to Health Cloud Integration
We are seeking a seasoned Senior Data Engineer to lead the integration of Snowflake with Salesforce Health Cloud, enabling scalable, real-time data exchange and analytics across healthcare platforms. This role is pivotal in building robust data pipelines, modeling healthcare data, and ensuring seamless ETL processes that support clinical, operational, and reporting needs.

Key Responsibilities

  • Architect and implement ETL/ELT pipelines to ingest and transform data from Snowflake to SFDC Health Cloud.
  • Design and maintain data models that support payer and provider analytics, including Portico, Excelys, and Amisys structures.
  • Leverage DBT for transformation workflows and documentation of data lineage.
  • Design and implement Snowflake clusters using Terraform.
  • Use Pandas and Python for data wrangling, preprocessing, and validation.
  • Collaborate with cross-functional teams to align integration efforts with business goals, including real-time sync and batch processing.
  • Ensure data quality, governance, and compliance with healthcare regulations.
  • Monitor pipeline performance and troubleshoot issues across environments.
  • Contribute to platform scalability and observability using tools such as MuleSoft, Informatica, and Salesforce APIs.
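To give a concrete flavor of the Pandas validation step described above, here is a minimal sketch of pre-load record validation for a member extract bound for Health Cloud. All field names (`member_id`, `plan_code`, etc.) are hypothetical illustrations, not from the posting, and a real pipeline would read from Snowflake and write via the Salesforce API rather than operate on an in-memory frame.

```python
import pandas as pd

# Hypothetical required fields for a Health Cloud-bound member record.
REQUIRED_FIELDS = ["member_id", "first_name", "last_name", "plan_code"]

def validate_members(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Split a member extract into load-ready rows and rejected rows.

    Rejects rows that are missing any required field or that repeat a
    member_id, after normalizing whitespace in the key columns.
    """
    df = df.copy()
    # Normalize whitespace and treat empty strings as missing values.
    for col in REQUIRED_FIELDS:
        df[col] = df[col].astype("string").str.strip().replace("", pd.NA)
    missing = df[REQUIRED_FIELDS].isna().any(axis=1)
    dupes = df["member_id"].duplicated(keep="first")
    bad = missing | dupes
    return df[~bad].reset_index(drop=True), df[bad].reset_index(drop=True)

# Example extract: one clean row, one duplicate id, one blank plan code,
# one missing member id.
extract = pd.DataFrame({
    "member_id":  ["M1", "M1", "M2", None],
    "first_name": ["Ana", "Ben", "Cam", "Dee"],
    "last_name":  ["Lee", "Fox", "Ito", "Ray"],
    "plan_code":  ["P10", "P20", "", "P30"],
})
good, rejected = validate_members(extract)
```

Rejected rows would typically be written to a quarantine table for review rather than dropped, so data-quality issues surface instead of silently shrinking the load.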

Required Qualifications

  • 5+ years of experience in data engineering, preferably in healthcare or CRM integrations.
  • Hands-on experience with Snowflake, DBT, and Pandas.
  • Strong SQL and Python skills.
  • Experience with Salesforce Health Cloud data structures and APIs.
  • Familiarity with the healthcare industry, HIPAA, and PII data management.
  • Familiarity with integration platforms such as MuleSoft, Informatica, or custom Snowflake connectors.
  • Experience with Master Data Management solutions and data management concepts: entity resolution, splits, merges, and data lineage.
  • Understanding of healthcare data domains, including eligibility, claims, care plans, and provider networks.
  • Excellent problem-solving and communication skills.

Preferred Qualifications

  • Deep experience with SQL, stored procedures, and materialized views.
  • Knowledge of real-time data sync, identity resolution, and multi-object updates in Salesforce.
  • Background in business process consulting or enterprise data strategy.
  • AS#01

