Data Engineer at Connsci
Gaithersburg, MD 20878, USA
Full Time


Start Date

Immediate

Expiry Date

27 Nov, 25

Salary

145,000

Posted On

27 Aug, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Transformation, ETL, SQL, Data Engineering, Cloud Services, Computer Science, Data Modeling, Data Governance, Government

Industry

Information Technology/IT

Description

Connsci is seeking a skilled Data Engineer to support the development of an anomaly detection platform for a large government customer. The Data Engineer will design, build, and manage data pipelines that integrate financial and procurement systems, ensuring that data ingestion, transformation, and storage processes are reliable, secure, and compliant with state and federal standards.

BASIC QUALIFICATIONS:

  • Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent experience).
  • At least 5 years of experience in data engineering or ETL development.
  • At least 2 years of experience with Azure cloud services, including Data Factory, Databricks, ADLS, Synapse/SQL, or similar.
  • At least 2 years of experience with data modeling and transformation using SQL and PySpark.

PREFERRED QUALIFICATIONS:

  • 7+ years of experience in data engineering or ETL development in government or other highly regulated industries.
  • Familiarity with data governance, compliance, and security frameworks (HIPAA, PII, FERPA, CJIS).
  • Experience supporting multi-environment deployments and collaborating with cross-functional teams.

Location: This role can be performed remotely, but preference will be given to candidates located in the same region as a Connsci office (DC Metro or Florida).

How To Apply:

If you would like to apply to this job directly from the source, please click here.

Responsibilities
  • Design and maintain secure data ingestion pipelines using Azure Data Factory, API Management, and SFTP.
  • Develop ETL/ELT workflows in Azure Databricks to cleanse, transform, and normalize data across multiple systems (an illustrative sketch follows this list).
  • Implement data quality checks (e.g., Great Expectations) to validate completeness, accuracy, and timeliness.
  • Manage data storage in Azure Data Lake Storage Gen2 (bronze/silver/gold architecture) with evidentiary controls (immutability, hashing, lineage).
  • Collaborate with Sr. Data Scientists to operationalize anomaly detection rules and risk scoring models.
  • Optimize pipelines for performance, scalability, and cost efficiency across DEV, TEST, POC, PROD environments.
  • Apply security best practices in alignment with NIST SP 800-53/171.
  • Troubleshoot pipeline failures, monitor jobs, and ensure compliance reporting readiness.
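
The sketch below is a minimal, hypothetical illustration of the kind of bronze-to-silver Databricks step these responsibilities describe: it reads raw procurement records from ADLS Gen2, normalizes a few fields with PySpark, attaches a per-row SHA-256 hash to support the evidentiary controls mentioned above, and runs simple completeness checks as a plain-PySpark stand-in for the Great Expectations suites named in the posting. The storage paths, column names, and schema are illustrative assumptions, not Connsci's actual design.

    # Hypothetical bronze -> silver cleansing step (PySpark on Azure Databricks).
    # Paths, columns, and checks are assumptions for illustration only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("bronze_to_silver_sketch").getOrCreate()

    # Assumed medallion layout in ADLS Gen2 (abfss://<container>@<account>/...)
    BRONZE_PATH = "abfss://bronze@example.dfs.core.windows.net/procurement/"
    SILVER_PATH = "abfss://silver@example.dfs.core.windows.net/procurement/"

    raw = spark.read.format("parquet").load(BRONZE_PATH)

    cleansed = (
        raw
        # Normalize vendor names and trim whitespace
        .withColumn("vendor_name", F.upper(F.trim(F.col("vendor_name"))))
        # Cast amounts to a consistent decimal type
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        # Per-row SHA-256 hash to support immutability and lineage checks downstream
        .withColumn(
            "row_hash",
            F.sha2(
                F.concat_ws(
                    "||",
                    F.col("invoice_id").cast("string"),
                    F.col("vendor_name"),
                    F.col("amount").cast("string"),
                ),
                256,
            ),
        )
    )

    # Basic data quality checks: completeness of the key field and non-negative amounts.
    null_keys = cleansed.filter(F.col("invoice_id").isNull()).count()
    negative_amounts = cleansed.filter(F.col("amount") < 0).count()
    if null_keys or negative_amounts:
        raise ValueError(
            f"Data quality failure: {null_keys} null invoice_ids, "
            f"{negative_amounts} negative amounts"
        )

    cleansed.write.format("delta").mode("append").save(SILVER_PATH)

In practice, checks like these would typically live in Great Expectations suites, and the paths would be parameterized per environment (DEV, TEST, POC, PROD), in line with the responsibilities above.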