Data Engineer at Encora
Remote, Oregon, USA - Full Time


Start Date

Immediate

Expiry Date

05 Aug, 25

Salary

0.0

Posted On

05 May, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Java, AWS, Performance Tuning, Version Control Tools, Azure, Scala, Data Modeling, dbt, SQL

Industry

Information Technology/IT

Description

Encora is looking for a skilled and motivated Data Engineer with 2–5 years of experience to join our growing data team. You will play a key role in designing, building, and maintaining scalable data pipelines and infrastructure that empower analytics, machine learning, and business intelligence. The ideal candidate is passionate about data, has a solid understanding of modern data engineering practices, and is comfortable working in a fast-paced environment.
This is a 6-month project with a high likelihood of extension, working 100% remote and supporting EST work hours.

REQUIRED QUALIFICATIONS:

  • 2–5 years of hands-on experience as a Data Engineer or similar role.
  • Proficiency in SQL and at least one programming language (e.g., Python, Java, or Scala).
  • Experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, dbt, Luigi).
  • Solid understanding of data modeling, warehousing concepts, and performance tuning.
  • Experience with cloud platforms such as AWS, GCP, or Azure.
  • Familiarity with version control tools like Git and CI/CD practices.
  • Strong problem-solving skills and attention to detail.
RESPONSIBILITIES:
  • Design, develop, and maintain robust ETL/ELT pipelines using tools like Apache Airflow, dbt, or similar.
  • Work closely with data analysts, scientists, and business stakeholders to understand data requirements and translate them into scalable solutions.
  • Optimize and manage data storage and data lake/warehouse solutions (e.g., Snowflake, BigQuery, Redshift, or Azure Synapse).
  • Ensure data quality, integrity, and compliance with data governance and security policies.
  • Monitor and troubleshoot data workflows to ensure high availability and performance.
  • Contribute to the architecture and development of a modern data platform using cloud technologies (AWS, GCP, or Azure).
  • Document data models, pipelines, and processes for internal use and training.