Senior Data Infrastructure Engineer at Gini Apps
Tel Aviv, Tel-Aviv District, Israel
Full Time


Start Date

Immediate

Expiry Date

20 Jun, 26

Salary

0.0

Posted On

22 Mar, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Snowflake, Databricks, BigQuery, Redshift, Python, Spark, PySpark, Data Lakes, Data Pipelines, ETL/ELT, Data Modeling, Monitoring, Security, Governance, Airflow, Dagster

Industry

Software Development

Description
We are looking for a Senior Data Infrastructure Engineer to lead the design, build, and optimization of a modern data platform. The role involves hands-on work with cloud-based data technologies, building data lakes from scratch, and managing large-scale data pipelines while ensuring high performance, cost efficiency, and reliability. You will collaborate closely with data engineering, data science, analytics, and product teams to support business needs.

Key Responsibilities:
- Design and build scalable data lakes/platforms using technologies such as Snowflake, Databricks, BigQuery, or Redshift
- Develop and optimize large-scale data pipelines for batch and streaming use cases
- Ensure high performance, scalability, and cost efficiency across data systems
- Work with complex data workflows, AI models, transformations, and orchestration
- Apply best practices in data modeling, monitoring, security, and governance

Requirements:
- 5+ years in data engineering or data infrastructure roles
- Proven experience building modern data platforms or data lakes from scratch
- Strong Python programming skills and experience with Spark/PySpark
- Knowledge of distributed systems and cloud-based architectures
- Experience with ETL/ELT processes and handling data at scale

Nice to Have:
- Experience with cloud providers (AWS, GCP, Azure)
- Familiarity with orchestration tools (Airflow, Dagster)
- Knowledge of data governance, security, and access control
- Experience supporting analytics, BI, or machine learning workloads

What We Offer:
- Ownership of end-to-end modern data platforms
- Opportunity to tackle high-impact, large-scale data challenges
- Collaborative, professional engineering environment
- Competitive compensation and benefits
Responsibilities
The role involves leading the design, build, and optimization of a modern data platform, including hands-on work with cloud technologies and building data lakes from scratch. Responsibilities also include developing and optimizing large-scale data pipelines while ensuring performance, cost efficiency, and reliability.