Snowflake Data Engineer at Spyrosoft
Gdańsk, Pomeranian Voivodeship, Poland - Full Time


Start Date

Immediate

Expiry Date

22 Jan, 26

Salary

Not specified

Posted On

24 Oct, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

dbt Cloud, Snowflake, Confluence, JIRA, Azure DevOps, ETL Development, Documentation, Agile, BI Team, Data Warehouse, SCRUM, Data Flows, Data Integration, Healthcare, Data Sphere Program, Code Review

Industry

IT Services and IT Consulting

Description
Tech stack:
- dbt Cloud (ETL development)
- Snowflake
- Confluence
- JIRA / Azure DevOps

Requirements:
- Great knowledge of dbt Cloud for ETL development
- Experience working with a Snowflake data warehouse as a target system for ETL
- Proficiency in preparing clear documentation in Confluence
- Ability to use ticketing systems such as JIRA and/or Azure DevOps
- Familiarity with Snowflake infrastructure is an advantage
- Ability to work in an agile BI team (DevOps) and to share skills and experience
- Fluency in English

Project description:
The project is expected to start in Q1 2026. You will play a key role in migrating and building ETL/ELT processes in the Snowflake infrastructure under the Data Sphere Program, establishing Snowflake as the primary data warehouse platform for Healthcare Commercial. The project will be managed using the Scrum methodology, ensuring iterative development and close collaboration with all stakeholders. It runs in 3-week sprints, with sprint planning tasks assigned to the Contractor and reviewed by the Product Owner at the end of each sprint.

Main responsibilities:
- Developing and optimizing data flows from source systems to warehouse structures within Snowflake using dbt Cloud
- Documenting the Snowflake / dbt Cloud ETL code in Confluence
- Estimating tasks assigned via the ticketing system and resolving them in a timely manner
- Participating in Scrum meetings of the Data team to plan the work and allow review by the Product Owner
- Consulting with the project team and end users on the code you created to facilitate proper handover
- Implementing ETL processes specified by the Architects to integrate data sources seamlessly into the Snowflake infrastructure
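To give a flavor of the day-to-day work, below is a minimal dbt model sketch of the kind this role would build and run in dbt Cloud against Snowflake. The file, source, and column names are hypothetical illustrations, not taken from the project:

    -- models/staging/stg_orders.sql (hypothetical file; all names illustrative)
    -- Minimal dbt model: cleans a raw source table and materializes it
    -- as a view in Snowflake. source() resolves a raw table declared in a
    -- sources .yml file; dbt Cloud compiles and runs this against the warehouse.
    {{ config(materialized='view') }}

    select
        order_id,
        customer_id,
        cast(order_date as date) as order_date,
        amount
    from {{ source('raw', 'orders') }}
    where order_id is not null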
Responsibilities
You will develop and optimize data flows from source systems to warehouse structures within Snowflake using dbt Cloud. Additionally, you will create documentation on the ETL code in Confluence and participate in Scrum meetings for work planning and review.