Data Engineer at solidcore
Arlington, Virginia, USA - Full Time


Start Date

Immediate

Expiry Date

02 Jun, 25

Salary

0.0

Posted On

02 Mar, 25

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Scripting, Automation, Snowflake, Soft Skills, Data Structures, Troubleshooting, Python, Azure, Teams, dbt, Communication Skills, SQL, AWS, Data Engineering, Computer Science

Industry

Information Technology/IT

Description

solidcore is seeking a Data Engineer to build and optimize data pipelines from multiple source systems into Snowflake. This role will focus on designing efficient ETL/ELT processes, creating optimized SQL views for analytics, and supporting our Python-based predictive modeling initiatives. The ideal candidate will combine strong Snowflake expertise with Python development skills to enable data-driven decision making across the organization.
This role is ideal for someone who is passionate about building robust data infrastructure, enabling advanced analytics, and supporting predictive modeling efforts.
Based in Arlington, VA and reporting to the Vice President of Financial Planning & Analysis, this role is an exciting opportunity to join a rapidly growing, investor-backed organization that aims to be the leader in the studio fitness space.

REQUIREMENTS

Technical Skills

  • Proven experience with Snowflake, including designing and optimizing data lakes and warehouses.
  • Proficiency in SQL for writing queries, creating views, and managing data structures.
  • Hands-on experience building ETL/ELT pipelines from multiple data sources.
  • Proficiency in Python for scripting, automation, and supporting predictive modeling workflows.
  • Familiarity with data integration tools (e.g., Airflow, dbt, Fivetran) is a plus.
  • Experience with data pipeline monitoring and troubleshooting.
  • Experience implementing data governance frameworks in a Snowflake environment.
-

Soft Skills

  • Strong problem-solving skills and attention to detail.
  • Excellent communication skills, with the ability to work collaboratively across teams.
  • A proactive mindset, with a passion for continuous learning and improvement.

-

Experience

  • Bachelor’s degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
  • 3+ years of experience in data engineering or a similar role.
  • Experience working in a cloud-based environment (AWS, Azure, or GCP).

RESPONSIBILITIES

Data Pipeline Architecture & Development

  • Design and implement scalable ETL/ELT pipelines from multiple source systems to Snowflake
  • Create automated, monitored workflows to ensure reliable data processing
  • Optimize pipeline performance and troubleshoot data flow issues
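
For a sense of the pipeline work described above, here is a minimal sketch of loading one source-system extract into Snowflake with the Python connector. The credentials, warehouse, database, schema, and table names are placeholders, not solidcore's actual systems.

```python
# Illustrative only: append one extract to a Snowflake staging table.
# All connection parameters and object names below are hypothetical.
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas


def load_daily_extract(df: pd.DataFrame, table_name: str) -> int:
    """Bulk-load a source-system extract into a staging table and return the row count."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",       # hypothetical warehouse
        database="RAW",            # hypothetical database
        schema="STUDIO_BOOKINGS",  # hypothetical schema
    )
    try:
        # write_pandas stages the DataFrame and runs COPY INTO for the load
        success, _, nrows, _ = write_pandas(conn, df, table_name, auto_create_table=True)
        if not success:
            raise RuntimeError(f"Load into {table_name} failed")
        return nrows
    finally:
        conn.close()


if __name__ == "__main__":
    extract = pd.DataFrame({"BOOKING_ID": [1, 2], "CLASS_DATE": ["2025-03-01", "2025-03-02"]})
    print(load_daily_extract(extract, "BOOKINGS_STG"), "rows loaded")
```

In practice an orchestrator such as Airflow or dbt (named earlier in the posting as a plus) would schedule, retry, and monitor a job like this.
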
-

Snowflake Development

  • Develop and maintain efficient SQL views and tables to support analytics needs
  • Optimize Snowflake performance through proper warehouse sizing and query tuning
  • Implement data quality checks and validation processes
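
As an illustration of this kind of Snowflake development, the sketch below creates a reporting view and runs a simple validation query through the Python connector; the view name, columns, and quality rule are invented for the example.

```python
# Illustrative sketch of the Snowflake development tasks above.
# View, schema, and column names are invented for the example.
import snowflake.connector

CREATE_VIEW_SQL = """
CREATE OR REPLACE VIEW ANALYTICS.REPORTING.V_DAILY_CLASS_REVENUE AS
SELECT
    class_date,
    studio_id,
    COUNT(*)        AS bookings,
    SUM(net_amount) AS revenue
FROM RAW.STUDIO_BOOKINGS.BOOKINGS_STG
GROUP BY class_date, studio_id
"""

# A simple data quality check: the view should never expose NULL grouping keys.
QUALITY_CHECK_SQL = """
SELECT COUNT(*)
FROM ANALYTICS.REPORTING.V_DAILY_CLASS_REVENUE
WHERE class_date IS NULL OR studio_id IS NULL
"""


def refresh_reporting_view(conn: snowflake.connector.SnowflakeConnection) -> None:
    cur = conn.cursor()
    try:
        cur.execute(CREATE_VIEW_SQL)
        cur.execute(QUALITY_CHECK_SQL)
        (null_keys,) = cur.fetchone()
        if null_keys:
            raise ValueError(f"{null_keys} rows in the view have NULL grouping keys")
    finally:
        cur.close()
```

Warehouse sizing and query tuning would happen alongside views like this, for example by reviewing query profiles for the heaviest reporting queries.
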

-

Analytics & ML Support

  • Collaborate with analytics team to ensure data accessibility and usability
  • Structure data to support Python-based predictive modeling initiatives
  • Create efficient data models that enable quick analysis and reporting
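
To show how warehouse data structured by this role might feed Python-based predictive modeling, here is a hedged sketch that pulls a feature table into pandas and fits a scikit-learn model; the feature table, columns, and churn target are hypothetical.

```python
# Hypothetical example: train a churn model on features served from Snowflake.
import snowflake.connector
from sklearn.linear_model import LogisticRegression

FEATURE_SQL = """
SELECT visits_last_30d, avg_class_rating, days_since_signup, churned
FROM ANALYTICS.ML.MEMBER_FEATURES
"""


def train_churn_model(conn: snowflake.connector.SnowflakeConnection) -> LogisticRegression:
    cur = conn.cursor()
    try:
        # fetch_pandas_all requires the pyarrow extra of the connector
        df = cur.execute(FEATURE_SQL).fetch_pandas_all()
    finally:
        cur.close()
    features = df[["VISITS_LAST_30D", "AVG_CLASS_RATING", "DAYS_SINCE_SIGNUP"]]
    target = df["CHURNED"]
    return LogisticRegression(max_iter=1000).fit(features, target)
```
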

-

Continuous Improvement

  • Research and recommend tools, technologies, and practices to enhance the data engineering process.
  • Stay up-to-date with the latest advancements in Snowflake, ETL technologies, and cloud data platforms.

-

Data Quality & Governance

  • Implement and maintain data governance frameworks focusing on security, privacy, and compliance standards
  • Establish automated data quality monitoring with clear documentation and alerting mechanisms
  • Create and enforce data access controls and retention policies while maintaining comprehensive data lineage
  • Partner with stakeholders to ensure regulatory compliance and data security best practices
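
As one example of the governance controls described above, the sketch below applies a Snowflake column masking policy through the Python connector so that only an approved role sees member emails; the policy, role, table, and column names are invented, and masking policies assume a Snowflake edition that supports them.

```python
# Hypothetical governance example: mask member emails for all but one role.
import snowflake.connector

GOVERNANCE_STATEMENTS = [
    # Masking policy: show the real value only to the analytics-engineering role.
    """
    CREATE MASKING POLICY IF NOT EXISTS ANALYTICS.GOVERNANCE.MASK_EMAIL
    AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('ANALYTICS_ENGINEER') THEN val ELSE '*** masked ***' END
    """,
    # Attach the policy to the column that holds member emails.
    """
    ALTER TABLE ANALYTICS.REPORTING.MEMBERS
      MODIFY COLUMN email SET MASKING POLICY ANALYTICS.GOVERNANCE.MASK_EMAIL
    """,
]


def apply_governance(conn: snowflake.connector.SnowflakeConnection) -> None:
    cur = conn.cursor()
    try:
        for stmt in GOVERNANCE_STATEMENTS:
            cur.execute(stmt)
    finally:
        cur.close()
```
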