Data Modeler at Itero Group
Harrisburg, Pennsylvania, USA - Full Time


Start Date

Immediate

Expiry Date

06 Dec, 25

Salary

Not specified

Posted On

07 Sep, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Profiling, Physical Modeling, Scalability, Data Architecture, Working Experience, Reporting Requirements, Stored Procedures, Computer Science, Information Systems, Facts, Data Systems, Erwin, Python, SQL Server, Metadata Management, Presentation Skills

Industry

Information Technology/IT

Description

ABOUT US

Itero Group is a Women-Owned Small Business focused on simplifying complex transformations. We empower clients in the private and government sectors to become more optimized, digitally enabled, and data-driven organizations through our comprehensive business consulting and innovative delivery solutions.
Itero Group’s dedicated team members are experienced thought leaders, tenacious workers, and creative thinkers. We hire people who are passionate about being catalysts for change - in our company, for our clients, and throughout their careers - and we empower people to express their ideas, create better practices, innovate better products, and become better professionals.
We have been named a Great Place to Work for six years, and offer a competitive salary and benefits package.

EXPERIENCE & SKILLS

  • 10+ years of experience in data architecture, data modeling, and database development.
  • Demonstrable expertise in data profiling, logical and physical modeling, and metadata management.
  • Advanced experience with data modeling tools such as Erwin or similar.
  • Proficiency in SQL Server, T-SQL, SSIS, ELT processes, stored procedures, and complex queries.
  • Strong knowledge of data warehousing concepts including star schemas, facts, dimensions, and normalization techniques.
  • Working experience with Azure Databricks, Delta Lake, Azure Synapse, and Python.
  • Experience evaluating data systems for discrepancies, performance, and scalability.
  • Familiarity with Agile/Scrum development environments and use of Azure DevOps.
  • Ability to analyze and translate business requirements into optimized technical solutions.
  • Strong communication and presentation skills for diverse audiences.
  • Ability to manage multiple concurrent projects with minimal supervision.

PREFERRED EXPERIENCE

  • Experience in public health or healthcare sectors working with health datasets and federal/state reporting requirements.
  • Familiarity with projects such as Data Modernization Initiatives, Reporting Hubs, and Unified Master Patient Index (UMPI) solutions like Verato.

EDUCATION

  • Bachelor’s degree in Computer Science, Information Systems, or a related field. Advanced study preferred.

If you are looking for a role where you will lead with integrity, create and innovate, inspire excellence, be a respected member of the team, drive results, and have fun, we look forward to connecting with you!

RESPONSIBILITIES

  • Serve as a senior-level data architect and modeler supporting EDW modernization efforts using Azure cloud technologies.
  • Design and implement logical and physical data models, supporting analytics, reporting, and operational data processing.
  • Develop and maintain data dictionaries, metadata repositories, and data governance documentation.
  • Collaborate with business analysts, developers, and DBAs to translate business requirements into optimized data models and flows.
  • Establish and enforce data modeling standards, frameworks, and best practices across teams.
  • Support the redesign and migration of the legacy EDW to Microsoft Azure, including Azure Databricks, Delta Lake, and Synapse.
  • Conduct data profiling, validation, and integrity checks to ensure data quality and accuracy.
  • Provide subject matter expertise for public health data initiatives, including projects such as PA NEDSS NextGen, PA LIMS Replacement, and COVID-19 response.
  • Perform research and recommend solutions for large-volume data processing and statistical analysis.
  • Participate in testing, documentation reviews, and QA processes for data models and systems.
  • Conduct training and knowledge transfer sessions on modeling practices, metadata, and standards.
  • Maintain documentation in SharePoint and provide weekly status updates in PeopleFluent, Daptiv (if required), and SharePoint.
  • Ensure compliance with federal and commonwealth data standards and security protocols.