DBT Developer

at Luxoft

Polska, Poland

Start Date: Immediate
Expiry Date: 26 Dec, 2024
Salary: Not Specified
Posted On: 29 Sep, 2024
Experience: N/A
Skills: Documentation, Data Processing, Access Control, Data Models, Writing, Teams, Data Integration, Jenkins, Snowflake, Data Flow, Data Transformation, Transformations, ETL, Models, Continuous Integration, Data Quality, Model Design, Version Control, SQL, Collaboration, DBT
Telecommute: No
Sponsor Visa: No

Description:

PROJECT DESCRIPTION

The DBT developer plays a pivotal role in automating and optimizing data movement between the critical data layers (Staging/STG, Integration/INT, Business/BUS, and Presentation/PRS) of a data pipeline built on Azure. The role bridges the gap between Azure Data Factory (ADF), which orchestrates the pipeline, and DBT, which executes transformations and ensures data readiness for consumption. Because ADF lacks the permissions to manage data movement directly, it triggers DBT, which then performs the data transformation and movement accurately and securely.

SKILLS

Must have
DBT (Data Build Tool):
Expertise in building, orchestrating, and managing data models using DBT, including deep knowledge of DBT commands, Jinja macros, and configurations.
Ability to define and manage dependencies between data models across layers (STG, INT, BUS, PRS) to ensure smooth orchestration and pipeline execution.
Proficiency in developing and managing DBT projects, performing testing, documenting models, and handling data transformation automation workflows.
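As an illustration of the cross-layer dependency management described above, a minimal sketch of an INT-layer model follows. The model, schema, and column names (stg_orders, int) are hypothetical; the `ref()` macro is how DBT builds its dependency graph between layers.

```sql
-- int_orders.sql — illustrative INT-layer DBT model (hypothetical names).
-- Materialized as a table in a custom schema; the ref() call declares a
-- dependency on the STG-layer model, so DBT orders the run STG -> INT.
{{ config(materialized='table', schema='int') }}

select
    order_id,
    customer_id,
    cast(order_date as date) as order_date,
    amount
from {{ ref('stg_orders') }}
where order_id is not null
```

A BUS- or PRS-layer model would reference this one the same way, via `{{ ref('int_orders') }}`, keeping orchestration order implicit in the model code rather than hand-maintained.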
Snowflake Expertise:
Proficient in working with Snowflake as the data warehouse solution, leveraging Snowflake’s architecture to optimize DBT pipelines.
Experience in writing and optimizing SQL queries for Snowflake, ensuring efficient use of virtual warehouses, data partitioning, and clustering.
Knowledge of Snowflake roles, access control, and permissions management to ensure secure and seamless data operations across environments.
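A hedged sketch of the kind of Snowflake DDL and access grants these points imply; the table, column, role, and schema names are hypothetical:

```sql
-- Illustrative Snowflake DDL (hypothetical names). A clustering key on a
-- common filter column helps micro-partition pruning on large tables.
create table if not exists bus.orders_fact (
    order_id   number,
    order_date date,
    region     varchar,
    amount     number(12, 2)
)
cluster by (order_date);

-- Grant the DBT execution role only what it needs (least privilege).
grant select, insert on table bus.orders_fact to role dbt_transform_role;
```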
SQL Proficiency:
Strong SQL skills to write, optimize, and troubleshoot complex queries tailored for Snowflake’s environment.
Ability to design efficient transformations using DBT and SQL, handling large datasets with performance tuning in mind.
Azure Data Platform:
Proficiency with Azure services such as Azure Data Factory (ADF), Azure Synapse Analytics, and Data Lake, ensuring seamless integration between DBT, ADF, and the Snowflake data platform.
Ability to use ADF to orchestrate and automate data pipelines and trigger DBT executions through APIs or custom activities.
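One common pattern for the ADF-to-DBT handoff is an ADF Web activity calling DBT Cloud's REST API to trigger a job run. A minimal Python sketch of building that request is below, using DBT Cloud's public v2 "trigger job run" endpoint; the account ID, job ID, and token are placeholders.

```python
"""Sketch of triggering a DBT Cloud job run over its REST API, as an ADF
Web activity or custom activity might. Account ID, job ID, and token
values here are hypothetical placeholders."""
import json
import urllib.request

DBT_CLOUD_HOST = "https://cloud.getdbt.com"

def build_trigger_request(account_id: int, job_id: int, token: str,
                          cause: str = "Triggered by ADF") -> urllib.request.Request:
    """Build the POST request that starts a DBT Cloud job run."""
    url = f"{DBT_CLOUD_HOST}/api/v2/accounts/{account_id}/jobs/{job_id}/run/"
    body = json.dumps({"cause": cause}).encode("utf-8")
    headers = {
        "Authorization": f"Token {token}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

# Sending the request (not executed here) would be:
#   urllib.request.urlopen(build_trigger_request(12345, 678, "dbt-cloud-token"))
req = build_trigger_request(12345, 678, "dbt-cloud-token")
print(req.full_url)  # → https://cloud.getdbt.com/api/v2/accounts/12345/jobs/678/run/
```

In ADF itself the same call is typically configured declaratively in a Web activity; the sketch just makes the URL, method, and auth header explicit.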
Version Control (e.g., Git):
Hands-on experience with Git for version control of DBT models, managing branches, collaborating with teams, and resolving merge conflicts efficiently.
Data Warehousing and ETL Concepts:
Deep knowledge of data warehousing architectures and best practices, including experience in layer-based architecture (STG, INT, BUS, PRS) for organizing data flow.
Expertise in ETL/ELT processes, data transformation, and ensuring consistent data quality across environments.
CI/CD Pipelines:
Experience with continuous integration and deployment (CI/CD) practices, particularly in automating DBT projects using ADF, Jenkins, or other CI tools.
Familiarity with using GitHub Actions, Azure DevOps, or similar tools to automate the build, testing, and deployment of DBT models.
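For the GitHub Actions case, a CI pipeline for a DBT project might look roughly like the fragment below; the workflow name, target, and secret names are hypothetical, while the dbt commands (`dbt deps`, `dbt build`) are standard dbt-core CLI.

```yaml
# Illustrative GitHub Actions workflow for DBT CI (hypothetical names).
name: dbt-ci
on:
  pull_request:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake   # dbt-core + Snowflake adapter
      - run: dbt deps                    # install package dependencies
      - run: dbt build --target ci       # run and test models in DAG order
        env:
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```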
Nice to have
Orchestration Frameworks:
Proficient in setting up DBT orchestration frameworks using ADF, Airflow, or Prefect to schedule and automate DBT runs, ensuring timely data processing across environments.
Hands-on experience in managing orchestration dependencies between DBT, Snowflake, and ADF pipelines, ensuring reliable data flow.
API Integration:
Experience with integrating DBT Cloud or DBT Core with other systems using APIs, including triggering DBT executions via ADF custom activities or REST API calls.
Ability to automate end-to-end data pipelines that span across Snowflake, DBT, and Azure services.
Problem-Solving:
Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve issues in data pipelines, optimize transformations, and manage bottlenecks efficiently.
Collaboration:
Experience working closely with data engineers, analytics teams, and business stakeholders to ensure alignment on data needs, model design, and data quality.
Ability to work cross-functionally, especially with Azure administrators, Snowflake architects, and other DBT developers, to ensure seamless data integration.
Documentation:
Ability to create and maintain comprehensive documentation for DBT models, workflows, and orchestration setups, ensuring all stakeholders are informed of data dependencies and transformations.
Adaptability and Innovation:
Willingness to continuously learn and adapt to emerging tools, techniques, and best practices in data orchestration, automation, and cloud-based data platforms.

Responsibilities:

This DBT-driven automation enhances efficiency, minimizes manual intervention, and ensures that data is processed consistently and is ready for end-user consumption in a timely manner. It plays a crucial role in meeting business intelligence and reporting needs by ensuring that data flows seamlessly across layers, maintaining data integrity, security, and availability.


REQUIREMENT SUMMARY

Min: N/A, Max: 5.0 year(s)

Information Technology/IT

IT Software - DBA / Datawarehousing

Software Engineering

Graduate

Proficient

1

Polska, Poland