Senior Data Engineer at ITRex Group
Ukraine -
Full Time


Start Date

Immediate

Expiry Date

20 Feb, 26

Salary

0.0

Posted On

22 Nov, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, Snowflake, DBT, Airflow, Python, Cloud Services, ETL, Data Governance, Git, CI/CD, Agile, Communication, Analytical Skills, Problem-Solving, Collaboration, Self-Driven

Industry

Software Development

Description
THE PLACE ITRex - AI pioneers who build systems that actually work in the real world, not just in demos. We're 200+ people spread across the US and Europe, creating solutions for companies like P&G and Shutterstock. We keep it simple, build it right, and focus on what works. THE PEOPLE We're the kind of people who don't ignore messages in Slack, who jump in to help when you're stuck on a problem, and who offer solutions instead of blame when things go sideways. We believe in openness, accountability, and having each other's backs. No office politics, no hidden agendas - just people who care about doing good work together and supporting each other to get there. THE ROLE Responsibilities: Analyze existing ETL jobs and data flows implemented in SAP Data Services. Redesign and reimplement data pipelines using DBT, Python, Snowflake stored procedures, and Airflow (AWS MWAA) Support migration of orchestration, scheduling, data models, and business logic from SAP Data Services to modern cloud tooling Build and maintain robust CI/CD pipelines for DBT projects and Airflow DAGs Implement and document best practices for data quality, monitoring, and observability across the data platform Collaborate closely with the internal Data Engineering team to ensure effective knowledge transfer and seamless integration into ongoing operations Technical Skills 5+ years of experience in Data Engineering within cloud-based environments Proven expertise with Snowflake (data modeling, performance optimization, access control) Strong experience with DBT (macros, testing, modular design) Proficiency with Airflow (DAG orchestration, scheduling, dependency management) Strong Python skills (data processing, automation, scripting) Experience with any Cloud Services Solid understanding of data warehouse concepts, ETL/ELT design patterns, and data governance principles Familiarity with Git-based workflows, CI/CD pipelines, and infrastructure-as-code concepts Business & Collaboration Excellent 
communication skills, both verbal and written, with an ability to convey information clearly and concisely Strong analytical and problem-solving abilities Collaborative mindset, capable of working in cross-functional Agile teams Self-driven and proactive, able to operate independently within a defined framework English proficiency: Upper-intermediate and above Nice to have: Experience migrating legacy ETL tools (preferably SAP Data Services) to modern cloud-native solutions Experience with Apache Iceberg/Data Lake Why people stay First, the foundation: Remote flexibility: Work where and how you work best - we trust you to deliver Fair compensation: Competitive salary + benefits that matter (medical, wellness, learning) Then, the growth: Ownership opportunities: See a problem worth solving? Own it. We back smart risks over bureaucratic safety AI enhancement: We leverage AI to make you faster and stronger - complementing your abilities, not replacing them Learning investment: English classes, professional development, well-being support Career progression: Real paths up, not just sideways shuffling Finally, the people: Responsive teammates: No ignored Slacks, no "not my problem" attitudes Supportive culture: When you're stuck, people help. When things break, we fix them together Human connections: Regular meetups, tech talks, and actual relationships beyond work Curious? We are too. Let's talk
Responsibilities
Analyze and redesign existing ETL jobs and data flows, migrating them to modern cloud tooling. Build and maintain CI/CD pipelines while ensuring data quality and effective collaboration with the Data Engineering team.