Data Engineer (Remote) at Govcio LLC
Remote, Oregon, USA
Full Time


Start Date

Immediate

Expiry Date

26 Nov, 25

Salary

$160,000

Posted On

26 Aug, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Spark, Leadership Skills, Data Processing, Python, Code, Data Modeling, Computer Science, IT

Industry

Information Technology/IT

Description

Overview:
GovCIO is seeking a Data Engineer to support operational solutions in our multi-cloud lakehouse environments on Azure and AWS. The ideal candidate thrives in a fast-paced operations and maintenance environment and brings expertise in data engineering and analytics, with hands-on experience in Databricks and Azure Synapse Analytics and a solid understanding of the Medallion Architecture. This customer-facing role will be instrumental in ensuring customer satisfaction.
This position will be fully remote within the United States.

Responsibilities:

  • Deliver data engineering work (ingestion, conditioning, and publishing) aligned with published standards.
  • Data pipeline development and data integration.
  • Write QA notebook(s) for all engineering work to be checked by CX Insights QA personnel, aligned with published policy.
  • Align engineering work with published data pipeline performance and monitoring standards.
  • Remediate data pipelines.
  • Comply with all Data Quality and governance standards.
  • Document Data Products.
  • Assist in reducing technical debt related to the items listed above.
  • Coordinate with/provide feedback to architects for architectural solutions.
  • Develop proofs of concept.
  • Document short-term work delivery adjustments, including quality notebooks, within the ticket.
  • Respond quickly to customer feature requests.
  • Troubleshoot, resolve, and properly capture customer-reported bugs in systems of record.
  • Experience with the following processes, tools, and platforms:
    • Azure cloud (required)
    • Azure APIs
    • Azure Synapse Analytics
    • Spark
    • Databricks
    • Python
    • Medallion Architecture in a Data Lakehouse environment
    • ELT/ETL processes
    • CI/CD pipelines
  • Collaborate with Architects, data engineers, data scientists, and business stakeholders to deliver scalable and reliable data solutions.
  • Role model the attributes of an ideal team player, fostering a culture of technical excellence and continuous learning.
  • Understand the business and its objectives, and translate complex data concepts into terms that non-technical stakeholders can understand.
  • Evaluate, size, and select technology components, such as software, hardware, and (limited) networking capabilities, for database management systems and application databases.
  • Develop, maintain, evaluate, and document the administration of database tools and strategies.
  • Design and implement data models and data access layers for new product functionality.
  • Write and tune SQL queries for performance and scalability.
  • Implement comprehensive backup and database replication solutions.
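The Medallion Architecture named in the responsibilities above layers data as bronze (raw), silver (cleansed/deduplicated), and gold (analytics-ready aggregates). A purely illustrative sketch of that layering follows; in practice this would run on Spark/Databricks against Delta tables, and all record and field names here (`customer_id`, `event_date`, `amount`) are hypothetical, not GovCIO's actual schema:

```python
# Minimal illustration of Medallion-style layering (bronze -> silver -> gold)
# using plain Python dicts as stand-ins for Delta tables.
from collections import defaultdict

def to_silver(bronze_rows):
    """Cleanse and deduplicate raw bronze records into the silver layer."""
    seen = set()
    silver = []
    for row in bronze_rows:
        if row.get("customer_id") is None:  # drop rows failing a basic quality check
            continue
        key = (row["customer_id"], row["event_date"])
        if key in seen:  # deduplicate on the business key
            continue
        seen.add(key)
        silver.append({**row, "amount": float(row["amount"])})  # normalize types
    return silver

def to_gold(silver_rows):
    """Aggregate silver records into an analytics-ready gold view."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["customer_id"]] += row["amount"]
    return dict(totals)
```

The same shape applies at scale: bronze ingestion preserves the source as-is, quality checks and deduplication happen once on the way to silver, and gold holds the business-level aggregates that reports consume.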

Qualifications:

REQUIRED SKILLS AND EXPERIENCE:

  • Bachelor’s degree in Engineering, Computer Science, Systems, Business, or a related scientific/technical discipline.
  • 15+ years of experience in IT, with at least 5 years in cloud architecture roles.
  • 8 years of additional relevant experience may be substituted for education.
  • Proven expertise in Azure (e.g., Data Lakehouse, Synapse, Data Factory).
  • Extensive hands-on experience with Databricks
  • Strong understanding and practical application of Medallion Architecture in data lakehouse environments.
  • Solid knowledge of data modeling, ETL/ELT processes, and big data technologies.
  • Proficiency in Python and Spark.
  • Experience with CI/CD pipelines, Infrastructure as Code (e.g., Terraform, ARM templates).
  • Strong communication and leadership skills.
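The responsibility to "write and tune SQL queries for performance and scalability" usually starts with checking the query plan for index usage. A minimal sketch using Python's built-in sqlite3 module (the posting's environments would be Synapse/Databricks SQL, and all table, column, and index names here are hypothetical):

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (customer_id TEXT, event_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("c1", "2025-08-01", 10.0), ("c2", "2025-08-03", 7.5)],
)

# Without an index, the filter below forces a full table scan; this index
# lets the planner seek directly on customer_id instead.
conn.execute("CREATE INDEX idx_events_customer ON events (customer_id)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM events WHERE customer_id = ?",
    ("c1",),
).fetchall()
# The plan detail should now mention idx_events_customer rather than a scan.
```

The same discipline carries over to distributed engines, where the analogues are partitioning, clustering/Z-ordering, and reading the query plan before and after each change.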

PREFERRED SKILLS AND EXPERIENCE:

  • Certifications such as AWS Certified Solutions Architect – Professional, Microsoft Certified: Azure Solutions Architect Expert, or Databricks Certified Data Engineer.
  • Experience with real-time data processing (e.g., Kafka, Event Hubs).
  • Familiarity with data governance tools (e.g., Unity Catalog, Purview).
  • Exposure to AI/ML model deployment in cloud environments.
  • AWS experience is nice to have.

Clearance Required: Ability to obtain and maintain a Suitability/Public Trust clearance.

How To Apply:

In case you would like to apply to this job directly from the source, please click here
