Middle Data Engineer at Globaldev Group
Armenia - Full Time


Start Date

Immediate

Expiry Date

21 Dec 2025

Salary

Not specified

Posted On

22 Sep 2025

Experience

2 years or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, SQL, Python, Snowflake, Airflow, ETL, ELT, Data Warehousing, API Integrations, Git, CI/CD, JSON Handling, Performance Tuning, Monitoring, Documentation, Incident Management

Industry

IT Services and IT Consulting

Description
We’re hiring a Middle Data Engineer to build and operate reliable ELT/ETL pipelines on Snowflake and Airflow. You’ll consolidate data from databases, REST APIs, and files into trusted, documented datasets with clear SLAs and ownership.

Responsibilities:
- Ingest data from RDBMS, APIs, and files into Snowflake (batch and incremental; CDC when applicable).
- Build modular SQL/Python transformations; handle semi-structured JSON; publish consumer-ready tables and views.
- Orchestrate Airflow DAGs (dependencies, retries, backfills, SLAs) with monitoring and alerting; a minimal DAG sketch follows this section.
- Ensure idempotent re-runs and backfills; maintain runbooks and perform root-cause analysis (RCA) for incidents.
- Tune performance and cost in Snowflake (warehouse sizing, pruning; clustering when justified).
- Partner with BI/Analytics to refine definitions and SLAs for delivered datasets.

Requirements:
- 2–4 years building production ETL/ELT; strong SQL (joins, window functions) plus Python for data tooling.
- Hands-on Snowflake experience: Streams/Tasks/Time Travel, performance and cost basics, JSON handling (see the connector sketch below).
- Airflow proficiency: reliable DAGs, retries/backfills, SLAs, monitoring and alert routing.
- Data warehousing and modeling (Kimball/3NF), schema evolution; API integrations (auth, pagination, rate limits, idempotency).
- Git-based CI/CD; clear written English; privacy/GDPR basics.

Will be a plus:
- iGaming familiarity: stakes, wins, GGR/NGR, RTP, retention/ARPDAU, funnels; responsible-gaming and regulatory awareness.
- Interest or experience in AI and automation: Snowflake Cortex for auto-documentation, semantic search over logs/runbooks, or parsing partner PDFs (with guardrails).
- Exposure to cloud storage (GCS/S3/ADLS), Terraform/Docker, and BI consumption patterns (Tableau/Looker/Power BI).

What we offer:
- Direct cooperation with an already successful, long-term, and growing project.
- Flexible work arrangements.
- 20 days of vacation.
- Truly competitive salary.
- Help and support from our caring HR team.
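To make the orchestration and idempotency expectations concrete, here is a minimal sketch of an Airflow DAG with retries, an SLA, and a backfill-safe load step. The DAG id and table names are hypothetical, and the load body is a placeholder, not the project's actual pipeline:

```python
# Minimal sketch of the orchestration pattern described above.
# Assumes Airflow 2.x; DAG id and table names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

default_args = {
    "owner": "data-eng",
    "retries": 3,                        # automatic retries on failure
    "retry_delay": timedelta(minutes=5),
    "sla": timedelta(hours=1),           # flag tasks that run past the SLA
}

def load_partition(ds: str, **_) -> None:
    """Load exactly one logical date's partition.

    Keying the load on the execution date (ds) makes re-runs and
    backfills overwrite the same partition instead of duplicating rows.
    """
    # Placeholder body; in practice this would run a Snowflake MERGE
    # or DELETE+INSERT scoped to WHERE event_date = ds.
    print(f"loading partition for {ds}")

with DAG(
    dag_id="raw_events_daily",           # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                   # Airflow 2.4+; older versions use schedule_interval
    catchup=True,                        # enables historical backfills
    default_args=default_args,
    max_active_runs=1,                   # serialize backfill runs
):
    PythonOperator(
        task_id="load_raw_events",
        python_callable=load_partition,
    )
```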
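The JSON-handling and idempotency requirements pair naturally: semi-structured payloads land in a VARIANT column, get flattened into typed columns, and are merged on a business key so re-runs are safe. A sketch using snowflake-connector-python follows; the account settings and all table and column names are hypothetical:

```python
# Sketch of semi-structured JSON handling plus an idempotent MERGE.
# Uses snowflake-connector-python; every object name is hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",          # placeholder credentials
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

FLATTEN_AND_MERGE = """
MERGE INTO staging.events AS t
USING (
    -- Project typed columns out of a VARIANT payload and explode
    -- the nested items array with LATERAL FLATTEN.
    SELECT
        raw:id::STRING         AS event_id,
        raw:ts::TIMESTAMP_NTZ  AS event_ts,
        item.value:sku::STRING AS sku
    FROM raw.events_json,
         LATERAL FLATTEN(input => raw:items) AS item
) AS s
ON t.event_id = s.event_id AND t.sku = s.sku
WHEN MATCHED THEN UPDATE SET t.event_ts = s.event_ts
WHEN NOT MATCHED THEN INSERT (event_id, event_ts, sku)
    VALUES (s.event_id, s.event_ts, s.sku)
"""

with conn.cursor() as cur:
    cur.execute(FLATTEN_AND_MERGE)  # MERGE keys make re-runs idempotent
conn.close()
```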

How To Apply:

In case you would like to apply to this job directly from the source, please click here

Responsibilities
The Middle Data Engineer will build and operate reliable ELT/ETL pipelines on Snowflake and Airflow. Responsibilities include ingesting data from various sources into Snowflake and orchestrating Airflow DAGs with monitoring and alerting.
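On the ingestion side, the posting calls out API integrations with auth, pagination, rate limits, and idempotency. A hedged sketch of that loop is below; the endpoint, headers, and response fields are hypothetical, not a real partner API:

```python
# Sketch of the API-integration checklist: bearer auth, cursor
# pagination, and rate-limit backoff. Endpoint and fields are hypothetical.
import time

import requests

BASE_URL = "https://api.example.com/v1/events"  # placeholder endpoint

def fetch_all(token: str) -> list[dict]:
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {token}"
    records, cursor = [], None
    while True:
        params = {"limit": 500}
        if cursor:
            params["cursor"] = cursor
        resp = session.get(BASE_URL, params=params, timeout=30)
        if resp.status_code == 429:        # rate limited: honor Retry-After
            time.sleep(int(resp.headers.get("Retry-After", "5")))
            continue
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["data"])
        cursor = payload.get("next_cursor")  # cursor-based pagination
        if not cursor:                       # no cursor means last page
            return records
```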