INTRODUCTION
We are seeking highly skilled Senior Data Modelers to support mission-critical contracts with our Federal and DoD clients. We are building advanced systems that solve complex national security challenges. We are also pioneering GenAI application development and test automation, and need innovative engineers ready to deliver secure, scalable solutions in a fast-paced environment.
JOB QUALIFICATIONS AND SKILLS
The ideal candidate will:
- Leverage financial industry expertise to define conceptual, logical, and physical data models in Databricks to support new and existing business domains
- Work with product owners, system architects, data engineers, and vendors to create data models optimized for query performance, compute, and storage costs
- Define best practices for the implementation of the Bronze/Silver/Gold data layers of the lakehouse
- Provide data model documentation and related artifacts, including data dictionaries and data definitions
REQUIREMENTS AND EDUCATION / CERTIFICATIONS
- Ability to pass a National Agency Check with Inquiries (NACI) (required at time of hiring)
- Ten or more years of experience in AI, data science, or software engineering, including knowledge of the data ecosystem
- Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent work experience
- Data Modeling: Expertise in designing and implementing data models optimized for storage, retrieval, and analytics within Databricks on AWS, including conceptual, logical, and physical data modeling
- Databricks Proficiency: In-depth knowledge and hands-on experience with the AWS Databricks platform, including Databricks SQL, Runtime, clusters, notebooks, and integrations
- ETL/ELT Processes: Proficiency in developing pipelines to extract data from various sources, transform it per business requirements, and load it into the central data lake using Databricks tools and Spark
- Data Integration: Experience integrating data from heterogeneous sources (relational databases, APIs, files) into Databricks while ensuring data quality, consistency, and lineage
- Performance Optimization: Ability to optimize data processing workflows and SQL queries in Databricks for performance, scalability, and cost-effectiveness, leveraging partitioning, clustering, caching, and Spark optimization techniques
- Data Governance and Security: Understanding of data governance principles and implementing security measures to ensure data integrity, confidentiality, and compliance within the centralized data lake environment
- Advanced SQL and Spark Skills: Proficiency in writing complex SQL queries and Spark code (Scala/Python) for data manipulation, transformation, aggregation and analysis tasks within Databricks notebooks
- Cloud Architecture: Understanding of cloud computing principles, AWS architecture, and services for designing scalable and resilient data solutions
- Data Visualization: Basic knowledge of data visualization tools (e.g., Tableau) to create insightful visualizations and dashboards for data analysis and reporting purposes
- Familiarity with government cloud deployment regulations/compliance policies such as FedRAMP, FISMA, etc.
- U.S. Citizenship (due to contract requirements)
- Strong communication and collaboration skills, including working in cross-functional teams
ABOUT US
We are a mission-first technology company dedicated to serving private industry and federal agencies with cutting-edge software, cybersecurity, and cloud engineering solutions. For over 20 years, we’ve partnered with leading Fortune 500 companies and the U.S. government on programs that matter, from national defense to public safety. We bring together elite technologists, disciplined program managers, seasoned program analysts, and passionate innovators to help secure the nation’s most critical missions.
Join us if you want to work with purpose on problems that matter with a team that values agility, integrity, and impact.