Principal Engineer at eSimplicity
Silver Spring, Maryland, USA - Full Time


Start Date

Immediate

Expiry Date

12 Sep, 25

Salary

$168,000

Posted On

13 Jun, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Software, ECS, CMS, Project Documentation, Technical Standards, AWS, Programming Languages, Hive, Large Scale Systems, JavaScript, Data Systems, Snowflake, Federal Government, Apache Spark, OAuth, Active Directory, Code, Communication Skills, EDM, Hadoop, Presentations

Industry

Information Technology/IT

Description

About Us:
eSimplicity is a modern digital services company that works across government, partnering with our clients to improve the lives and ensure the security of all Americans, from soldiers and veterans to kids and the elderly, and to defend national interests on the battlefield. Our engineers, designers, and strategists cut through complexity to create intuitive products and services that equip Federal agencies with solutions to courageously transform today for a better tomorrow for all Americans.

POSITION OVERVIEW:

We are seeking a hands-on Engineering Lead with 7+ years of experience to lead technical efforts in connecting or potentially migrating data between two large-scale, petabyte-level data systems. This role requires a deep understanding of modern data architecture, cloud-native platforms, and scalable data processing. The ideal candidate will have strong leadership skills, a collaborative mindset, and the ability to guide cross-functional teams through complex technical challenges involving disparate infrastructure environments.

REQUIRED QUALIFICATIONS:

  • All candidates must pass a public trust clearance through the U.S. Federal Government. This requires candidates to either be U.S. citizens or pass clearance through the Foreign National Government System, which requires that candidates have lived within the United States for at least 3 of the previous 5 years, hold a valid, non-expired passport from their country of birth, and have appropriate visa/work permit documentation.
  • 7+ years of experience in software or data engineering, including experience leading technical teams or project tracks.
  • 7+ years of software engineering experience across object-oriented programming languages (Python and JavaScript preferred).
  • Strong hands-on expertise with cloud-based data platforms, including AWS, Apache Spark, Databricks, and related tools.
  • Proven ability to rapidly understand and assimilate complex, large-scale systems.
  • Proven expertise in designing large-scale system architectures.
  • Proficient in AWS cloud infrastructure and services (VPC, DNS, Route53, Peering, RDS, S3, IAM, EKS, ECS,…).
  • Experience designing and managing automated data pipelines, especially in environments with large, complex datasets (see the pipeline sketch following this list).
  • Experience developing event-driven architectures and secure RESTful API Services.
  • Proficiency with Terraform, CloudFormation, or similar infrastructure-as-code tools.
  • Familiarity with DevOps/DevSecOps pipelines and secure deployment practices.
  • Strong problem-solving skills with the ability to work across engineering and stakeholder groups.
  • Excellent communication skills and the ability to clearly convey technical decisions and risks.
  • Ability to develop comprehensive project documentation, such as playbooks, how-to docs, design docs, presentations.
  • Knowledge of AuthN and AuthZ systems, including Active Directory, Okta, OAuth, and SAML.
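
The sketch below is a minimal, hedged illustration of the kind of cross-platform pipeline work referenced above, assuming a PySpark environment; the table name, partition column, and S3 bucket are hypothetical placeholders, not actual CMS systems.

```python
# Illustrative sketch only: read from one hypothetical platform, apply a light
# transformation, and write to another, one partition at a time. All table,
# column, and bucket names below are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("cross-platform-sync")   # hypothetical job name
    .getOrCreate()
)

# Read one partition of a catalogued source table (e.g., Hive/Glue metastore).
source_df = (
    spark.read.table("legacy_warehouse.claims")        # placeholder table
    .where(F.col("load_date") == "2025-06-01")         # process a single partition
)

# Light transformation: normalize column names and stamp lineage metadata.
target_df = (
    source_df
    .select([F.col(c).alias(c.lower()) for c in source_df.columns])
    .withColumn("migrated_at", F.current_timestamp())
)

# Write to the target platform (e.g., S3-backed Parquet/Delta on Databricks),
# partitioned so downstream consumers can prune efficiently.
(
    target_df.write
    .mode("append")
    .partitionBy("load_date")
    .parquet("s3://example-target-bucket/claims/")      # placeholder bucket
)
```

In practice a migration at this scale would be parameterized per partition and driven by an orchestrator (for example Airflow or Step Functions); the sketch only shows the shape of a single read-transform-write step.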

DESIRED QUALIFICATIONS:

  • CMS and Healthcare Expertise: In-depth knowledge of CMS regulations and experience with complex healthcare projects, particularly data infrastructure projects or similar efforts.
  • Demonstrated success providing support within the CMS OIT environment, ensuring alignment with organizational goals and technical standards.
  • Demonstrated experience and familiarity with CMS OIT data systems (e.g. IDR-C, CCW, EDM)
  • Familiarity with legacy and modern data systems such as Snowflake, Redshift, Hadoop, or Hive.
  • Familiarity with data mesh principles and domain-oriented architectures.
  • Understanding of federal security standards, compliance frameworks, and cloud governance.


Responsibilities
  • Lead engineering efforts to design and implement integration or migration solutions across petabyte-scale data platforms.
  • Provide technical direction and mentoring to a multidisciplinary team of data engineers, DevOps, and cloud professionals.
  • Collaborate with stakeholders to define integration strategies, assess migration feasibility, and align with broader data and cloud initiatives.
  • Oversee development and automation of secure, scalable data pipelines across evolving or heterogeneous environments (see the event-driven sketch following this list).
  • Promote best practices in infrastructure-as-code, DevSecOps, and cloud-based data architecture (with a focus on AWS and Databricks).
  • Enforce best practices and coding standards that promote code reusability, maintainability, and performance.
  • Troubleshoot and resolve technical challenges involving data movement, transformation, security, and infrastructure reliability.
  • Support delivery planning, including sprint coordination, roadmap alignment, and milestone tracking.
  • Communicate technical progress, risks, and recommendations to senior leadership and project stakeholders.
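
As a hedged illustration of the event-driven, automated pipeline work described above, the sketch below shows a minimal AWS Lambda-style handler that reacts to S3 object-created events and starts a downstream workflow; the state machine ARN and bucket layout are hypothetical placeholders.

```python
# Illustrative sketch of an event-driven ingestion trigger: a Lambda-style
# handler that receives S3 "object created" events and starts a Step Functions
# execution for each newly landed object. All resource names are placeholders.
import json
import os

import boto3

sfn = boto3.client("stepfunctions")

# Hypothetical state machine that runs the actual pipeline.
STATE_MACHINE_ARN = os.environ.get(
    "STATE_MACHINE_ARN",
    "arn:aws:states:us-east-1:123456789012:stateMachine:example-pipeline",
)


def handler(event, context):
    """Start one pipeline execution per object referenced in the S3 event."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Hand the object location to the workflow as its input payload.
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )

    return {"statusCode": 200}
```

The same pattern extends to other event sources (Kinesis, EventBridge, SQS); the design point is that pipeline runs are triggered by data arrival rather than by fixed schedules.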