Data Engineer at Qualysoft
Budapest, Hungary -
Full Time


Start Date

Immediate

Expiry Date

18 Sep, 25

Salary

Not specified

Posted On

20 Jun, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, Git, Integration, Data Modeling

Industry

Information Technology/IT

Description

Founded in Vienna in 1999, the Qualysoft Group is a manufacturer-independent IT consulting and services company that supports its international customers in boosting their competitiveness and economic efficiency through innovative IT solutions.
Its focus is on financial services providers, telecommunications companies, the automotive industry, and energy service providers. Over 400 employees across 6 subsidiaries work together to deliver state-of-the-art solutions for our clients.
We are looking for new colleagues to join Qualysoft teams on diverse projects that offer continuous learning opportunities. Our shared goal is to provide honesty, professional development, and a stable background while working with the latest technologies. We look forward to your application for the position below!
Join a collaborative, forward-thinking environment where your ideas matter, innovation is encouraged, and cutting-edge technology drives everything we do.

REQUIREMENTS

  • Proficient in Python, with a focus on clean, efficient, and maintainable code
  • Hands-on experience with Databricks and cloud-based data engineering tools
  • Skilled in Snowflake or other cloud data warehousing platforms
  • Solid understanding of ETL principles, data modeling, and integration best practices
  • Comfortable working in agile, fast-paced, collaborative environments
  • Experienced with Git and version control systems
  • Detail-oriented with a strong problem-solving mindset
  • Familiar with Linux systems and REST API integrations
RESPONSIBILITIES
  • Design and develop scalable, reliable ETL processes using Python and Databricks
  • Build and maintain data pipelines for extracting, transforming, and loading data from various sources
  • Take ownership of the full data engineering lifecycle, from extraction to transformation and loading
  • Optimize data workflows, ensuring robust error handling, monitoring, and performance tuning
  • Work within an agile environment, actively participating in sprint planning, stand-ups, and retrospectives
  • Conduct code reviews and maintain high coding standards
  • Develop tooling and automation scripts to improve operational efficiency
  • Implement comprehensive testing for data pipelines, including unit and integration tests
  • Integrate data sources via REST APIs and other techniques
  • Maintain up-to-date technical documentation and data flow diagrams