Middle Data Engineer (GCP, BigQuery) at Exadel Inc
Wilkowice, Silesian Voivodeship, Poland
Full Time


Start Date

Immediate

Expiry Date

13 Mar, 26

Salary

0.0

Posted On

13 Dec, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Dataflow, Apache Beam, BigQuery, Cloud Storage, Python, SQL, dbt, Dataform, Data Modeling, CI/CD, Logging, Metrics, Monitoring, Debugging, Event Streaming, Orchestration Tools

Industry

Software Development

Description
Why Join Exadel

We’re an AI-first global tech company with 25+ years of engineering leadership, 2,000+ team members, and 500+ active projects powering Fortune 500 clients, including HBO, Microsoft, Google, and Starbucks. From AI platforms to digital transformation, we partner with enterprise leaders to build what’s next. What powers it all? Our people, who are ambitious, collaborative, and constantly evolving.

What You’ll Do

Pipeline Engineering
- Build and maintain data pipelines using Apache Beam and Dataflow under the guidance of senior engineers
- Develop ingestion patterns across batch or near-real-time workflows
- Write Python and SQL for transformations, validations, and automation tasks
- Create BigQuery tables with sound partitioning and clustering choices

Transformation and Modeling
- Use dbt or Dataform to manage transformations and testing
- Contribute to data model implementation following established standards
- Document logic and assumptions clearly for partner teams

Production Operations
- Support production workloads by monitoring pipelines, analyzing issues, and applying fixes
- Contribute to performance tuning efforts across BigQuery and Dataflow
- Participate in the implementation of CI/CD practices for data workflows

Collaboration and Growth
- Work with analysts, scientists, and engineers to understand requirements
- Participate in code reviews and apply feedback to improve your craft
- Learn modern GCP approaches through close coordination with senior engineers and architects

What You Bring

Technical Skills
- Experience with Dataflow, Apache Beam, BigQuery, Cloud Storage, or similar cloud-native tools
- Solid proficiency in Python for data tasks and automation
- Strong SQL skills and a clear understanding of analytic query patterns
- Experience with dbt or Dataform for transformations and testing
- Understanding of common data modeling concepts used in analytics environments

Engineering Skills
- Familiarity with CI/CD practices
- Comfort working with logging, metrics, and monitoring tools
- Interest in data quality practices and validation frameworks
- Strong debugging instincts and patience with iterative problem solving

Professional Qualities
- Clear communication with teammates and partner groups
- Desire to grow your technical depth through real project experience
- Steady focus on reliability, clarity, and maintainability

Nice to Have
- Experience with Pub/Sub or other event streaming tools
- Exposure to Dataproc or Spark from legacy environments
- Familiarity with Vertex AI or ML-related workflows
- Understanding of orchestration tools such as Composer or Airflow

English Level

Intermediate+

Legal & Hiring Information

Exadel is proud to be an Equal Opportunity Employer, committed to inclusion across minority status, gender identity, sexual orientation, disability, age, and more. Reasonable accommodations are available to enable individuals with disabilities to perform essential functions. Please note: this job description is not exhaustive, and duties and responsibilities may evolve based on business needs.

Your Benefits at Exadel

Exadel benefits vary by location and contract type. Your recruiter will fill you in on the details.
- International projects
- In-office, hybrid, or remote flexibility
- Medical healthcare
- Recognition program
- Ongoing learning & reimbursement
- Well-being program
- Team events & local benefits
- Sports compensation
- Referral bonuses
- Top-tier equipment provision

Exadel Culture

We lead with trust, respect, and purpose. We believe in open dialogue, creative freedom, and mentorship that helps you grow, lead, and make a real difference. Ours is a culture where ideas are challenged, voices are heard, and your impact matters.
Responsibilities
The Middle Data Engineer will build and maintain data pipelines, develop ingestion patterns, and support production workloads. They will also collaborate with analysts and engineers to understand requirements, and will participate in code reviews.