Medicaid Data Engineer at OMTECH LLC
Minneapolis, MN 55435, USA
Full Time


Start Date

Immediate

Expiry Date

22 Oct 2025

Salary

$60.00 per hour

Posted On

22 Jul 2025

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Pipeline Development, Data Transformation, PostgreSQL, Data Modeling, Communication Skills

Industry

Information Technology/IT

Description

Medicaid Data Engineer
Hybrid - Minneapolis, MN
6-month contract-to-hire
Local candidates only
Must be able to convert to full-time without sponsorship

PREFERRED EXPERIENCE & SKILLS

  • Hands-on experience with Terraform for infrastructure provisioning and CI/CD automation.
  • Strong background in ETL/ELT pipeline development and data transformation.
  • Solid experience with API integrations for ingesting external datasets.
  • Proficiency in CosmosDB (or equivalent NoSQL technologies) and PostgreSQL.
  • Familiarity with data warehousing concepts and dimensional data modeling.
  • Prior experience working with provider or healthcare datasets is highly preferred.
  • Proven ability to work independently and navigate ambiguity in a complex data environment.
  • Excellent problem-solving abilities and a proactive, self-starter mindset.
  • Strong communication skills and the ability to collaborate with technical and non-technical stakeholders.

Job Types: Full-time, Contract
Pay: $50.00 - $60.00 per hour

Experience:

  • Healthcare Data Management: 3 years (Preferred)

Ability to Commute:

  • Minneapolis, MN 55435 (Required)

Work Location: Hybrid remote in Minneapolis, MN 55435

Responsibilities
  • Design and develop ETL/ELT pipelines for ingesting and transforming provider, clinic, and hospital data from multiple sources.
  • Build and maintain scalable data infrastructure using Terraform for infrastructure as code and CI/CD deployments.
  • Integrate and consume RESTful APIs for real-time and batch data ingestion.
  • Optimize and manage CosmosDB (NoSQL) and PostgreSQL (relational) data storage solutions.
  • Analyze and enhance existing data flows and propose improvements for performance and scalability.
  • Collaborate with cross-functional teams, including product, data science, and DevOps, to align data initiatives with business goals.
  • Contribute to data modeling, warehousing strategies, and metadata management.