Snowflake Data Engineer | AI & Data Team at Deloitte
Budapest, Közép-Magyarország, Hungary
Full-time


Start Date

Immediate

Expiry Date

29 Apr, 25


Posted On

30 Jan, 25

Experience

3 year(s) or above

Remote Job

No

Telecommute

No

Sponsor Visa

No

Skills

Python, Spark, Data Warehouse, Java, Scala, English, Relational Databases, SQL, Reporting Technologies, German, Data Solutions, Data Engineering

Industry

Information Technology/IT

Description

GENERAL INFORMATION

Position
Snowflake Data Engineer | AI & Data Team
Work arrangement
Full-time
City
Budapest
Country
Hungary
Department
Consulting
Team
Strategy, Analytics and M&A
Area of interest
Cloud
Way of work
Hybrid

WHO WE ARE LOOKING FOR

We are looking for a Data Engineer focusing on Snowflake to join our international AI & Data team. Do you want to build analytical solutions in the cloud? Are you enthusiastic about cloud and data in general? Then you are just the person we are looking for.
Our team delivers cutting-edge solutions in big data analytics, advanced analytics, integrations, machine learning, and artificial intelligence to both external and internal clients. We empower our clients to unlock the full potential of their data, driving informed decision-making and business success. Additionally, we design, develop, and deploy our own cloud-based products to meet evolving market needs.
What we expect from you:
  • 3+ years of relevant experience in data engineering or related fields.
  • SnowPro Core Certification is required; SnowPro Advanced Certification is a plus.
  • Proficient in SQL and Python with hands-on experience in relational databases and familiarity with non-relational databases.
  • Strong understanding of Snowflake components and its integration with various data processing and reporting technologies.
  • Proven experience in designing and implementing production-grade data solutions on Snowflake Data Warehouse.
  • Practical experience in building production-ready data ingestion and processing pipelines using Java, Spark, Scala, or Python.
  • Self-motivated with a passion for continuous learning and exploring new technologies.
  • Effective team player with a proactive attitude and a willingness to take on challenges.
  • Excellent written and verbal communication skills in English; knowledge of German is an advantage.
Responsibilities
  • Develop and optimize ETL pipelines using Python and SnowSQL, including writing advanced SQL queries against the Snowflake Data Warehouse.
  • Assist in designing an optimal data delivery architecture on Snowflake to support diverse, data-driven use cases.
  • Support clients in their digital transformation journey, from crafting data strategy concepts to architecting and implementing specific use cases.
  • Create and maintain comprehensive project documentation and technical diagrams.
  • Collaborate with clients, internal teams, and vendors to deliver successful outcomes across a variety of projects.
  • Engage in continuous professional development through a wide range of training opportunities.