Data Engineer at Nimble
North Carolina, USA
Full Time


Start Date

Immediate

Expiry Date

19 Jun, 25

Salary

0.0

Posted On

20 Mar, 25

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

ETL, Power BI, Predictive Analytics, Snowflake, Data Engineering, Data Security, DAX, Transformation, Data Services, Analytics, Scala, Data Processing, RCM, Python, SQL, Data Modeling

Industry

Information Technology/IT

Description

Join a leading Revenue Cycle Management (RCM) company dedicated to transforming healthcare data into actionable insights. We leverage cutting-edge technology to streamline financial and operational processes, improving efficiency and patient outcomes. We are looking for a Data Engineer to help optimize data pipelines and build a next-generation data infrastructure incorporating technologies such as Microsoft Fabric, Azure Synapse, Databricks, and Snowflake.

JOB SUMMARY:

As a Data Engineer, you will play a crucial role in designing, building, and maintaining robust data pipelines and architectures. You will optimize data workflows, ensure scalability, and contribute to the development of a new data infrastructure that integrates with Microsoft Fabric, Azure Synapse, Databricks, Snowflake, and other cloud-based technologies. This role requires expertise in cloud-based data solutions and big data processing, along with the ability to collaborate with cross-functional teams to enhance healthcare data analytics and operational efficiency.

REQUIRED SKILLS & EXPERIENCE:

  • 3+ years of experience in data engineering or a related field
  • Expertise in SQL for data processing, transformation, and performance optimization
  • Proficiency in Python or Scala for data engineering workflows
  • Strong knowledge of Azure data services, including Microsoft Fabric, Azure Synapse, Azure Data Factory, and Databricks, as well as Snowflake
  • Experience working with large-scale data architectures in cloud environments
  • Proficiency in ETL/ELT workflows and data pipeline optimization (a brief illustrative sketch follows this list)
  • Hands-on experience with healthcare data (e.g., claims, EMR/EHR, HL7, FHIR)
  • Familiarity with data security, compliance, and governance best practices in healthcare
  • Ability to work in an agile, collaborative, and remote environment
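
For orientation only: the sketch below shows the kind of ETL/ELT step these requirements describe, written in PySpark (a natural fit for Databricks, Synapse Spark, or Fabric notebooks). The storage paths, column names, and business rules are assumptions made for illustration, not this company's actual pipeline.

    # Illustrative only: a minimal PySpark batch ETL step. Paths, columns,
    # and filters are assumptions, not this employer's real pipeline.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("claims_etl_sketch").getOrCreate()

    # Extract: raw claim records previously landed in the data lake as Parquet.
    claims = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/claims/")

    # Transform: de-duplicate, normalize types, and derive a payment-lag metric.
    curated = (
        claims.dropDuplicates(["claim_id"])
        .withColumn("billed_amount", F.col("billed_amount").cast("decimal(12,2)"))
        .withColumn("days_to_payment", F.datediff("paid_date", "submitted_date"))
        .filter(F.col("billed_amount") > 0)
    )

    # Load: write a curated Delta table partitioned for downstream BI/analytics.
    (
        curated.write.format("delta")
        .mode("overwrite")
        .partitionBy("payer_id")
        .save("abfss://curated@examplelake.dfs.core.windows.net/claims_clean/")
    )

The same flow could equally be expressed as SQL-first ELT in Synapse or Snowflake; the emphasis here is comfort with the extract-transform-load pattern rather than any single tool.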

PREFERRED SKILLS:

  • Experience with Microsoft Fabric, Azure Synapse, Databricks, and Snowflake in a production environment
  • Knowledge of other cloud-based data platforms and integration tools
  • Hands-on experience with Power BI, DAX, and data modeling
  • Experience with machine learning pipelines or predictive analytics in healthcare
  • Previous experience in RCM, insurance, or healthcare analytics

RESPONSIBILITIES:

  • Design, develop, and optimize scalable ETL/ELT data pipelines for healthcare RCM processes
  • Build and maintain a modern data infrastructure incorporating Microsoft Fabric, Azure Synapse, Databricks, Snowflake, and other cloud technologies
  • Collaborate with data architects, analysts, and engineering teams to improve data accessibility and performance
  • Ensure data quality, security, and compliance with healthcare regulations (HIPAA, HITRUST)
  • Optimize database performance and implement best practices for data governance and metadata management
  • Work with structured and unstructured data, integrating diverse data sources such as EHR/EMR systems, claims data, and financial records
  • Implement real-time and batch data processing solutions using various cloud data platforms and tools (a streaming sketch follows this list)
  • Support data integration with BI and analytics tools such as Power BI
  • Write and optimize complex SQL queries to transform and analyze large healthcare datasets
  • Mentor junior engineers and contribute to technical best practices
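
As one hedged illustration of the real-time responsibility above, the snippet below uses Spark Structured Streaming to read remittance events from a Kafka-compatible endpoint (Azure Event Hubs exposes one) and append 15-minute payment aggregates to a Delta table. The broker address, topic, schema, and output paths are hypothetical.

    # Illustrative only: a Spark Structured Streaming pattern for near-real-time
    # processing. Broker address, topic, schema, and paths are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("remit_stream_sketch").getOrCreate()

    schema = StructType([
        StructField("claim_id", StringType()),
        StructField("paid_amount", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    # Read remittance events from a Kafka-compatible endpoint (e.g., Event Hubs).
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "example-ns.servicebus.windows.net:9093")
        .option("subscribe", "remittance-events")
        .load()
        .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Aggregate payments in 15-minute windows and append them to a Delta table;
    # the watermark bounds state so events later than 30 minutes are dropped.
    (
        events.withWatermark("event_time", "30 minutes")
        .groupBy(F.window("event_time", "15 minutes"))
        .agg(F.sum("paid_amount").alias("paid_total"))
        .writeStream.format("delta")
        .outputMode("append")
        .option("checkpointLocation", "/tmp/checkpoints/remit_agg_sketch")
        .start("/tmp/curated/remit_agg_15m_sketch")
    )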