Description:
About Us:
eSimplicity is a modern digital services company that works across government, partnering with our clients to improve the lives and ensure the security of all Americans—from soldiers and veterans to kids and the elderly—and to defend national interests on the battlefield. Our engineers, designers, and strategists cut through complexity to create intuitive products and services that equip Federal agencies with solutions to courageously transform today for a better tomorrow for all Americans.
POSITION OVERVIEW:
We are seeking a hands-on Engineering Lead with 7+ years of experience to lead technical efforts in connecting or potentially migrating data between two large-scale, petabyte-level data systems. This role requires a deep understanding of modern data architecture, cloud-native platforms, and scalable data processing. The ideal candidate will have strong leadership skills, a collaborative mindset, and the ability to guide cross-functional teams through complex technical challenges involving disparate infrastructure environments.
REQUIRED QUALIFICATIONS:
- All candidates must pass public trust clearance through the U.S. Federal Government. This requires candidates to either be U.S. citizens or pass clearance through the Foreign National Government System, which requires that candidates have lived within the United States for at least 3 of the previous 5 years, hold a valid, non-expired passport from their country of birth, and have appropriate visa/work permit documentation.
- 7+ years of experience in software or data engineering, including experience leading technical teams or project tracks.
- 7+ years of software engineering experience with object-oriented programming languages (Python and JavaScript preferred).
- Strong hands-on expertise with cloud-based data platforms, including AWS, Apache Spark, Databricks, and related tools.
- Proven ability to rapidly understand and assimilate complex, large-scale systems.
- Proven expertise in designing large-scale system architectures.
- Proficient in AWS cloud infrastructure and services (VPC, DNS, Route53, Peering, RDS, S3, IAM, EKS, ECS,…).
- Experience designing and managing automated data pipelines, especially in environments with large, complex datasets.
- Experience developing event-driven architectures and secure RESTful API Services.
- Proficiency with Terraform, CloudFormation, or similar infrastructure-as-code tools.
- Familiarity with DevOps/DevSecOps pipelines and secure deployment practices.
- Strong problem-solving skills with the ability to work across engineering and stakeholder groups.
- Excellent communication skills and the ability to clearly convey technical decisions and risks.
- Ability to develop comprehensive project documentation, such as playbooks, how-to docs, design docs, presentations.
- Knowledge of AuthN and AuthZ systems, including Active Directory, Okta, OAuth, and SAML.
DESIRED QUALIFICATIONS:
- CMS and Healthcare Expertise: In-depth knowledge of CMS regulations and experience with complex healthcare projects, particularly data infrastructure projects or similar.
- Demonstrated success providing support within the CMS OIT environment, ensuring alignment with organizational goals and technical standards.
- Demonstrated experience and familiarity with CMS OIT data systems (e.g., IDR-C, CCW, EDM).
- Familiarity with legacy and modern data systems such as Snowflake, Redshift, Hadoop, or Hive.
- Familiarity with data mesh principles and domain-oriented architectures.
- Understanding of federal security standards, compliance frameworks, and cloud governance.