Data Analyst/Engineer (Databricks)

at NiX

Kraków, małopolskie, Poland

Start Date: Immediate
Expiry Date: 25 Dec, 2024
Salary: Not Specified
Posted On: 26 Sep, 2024
Experience: 3 year(s) or above
Skills: Gitlab, Apache Spark, Azure, Snowflake, Cloud, Databases, Python, Relational Databases, SQL, Communication Skills
Telecommute: No
Sponsor Visa: No
Required Visa Status:
  • Citizen
  • GC (Green Card)
  • US Citizen
  • Student Visa
  • H1B
  • CPT
  • OPT
  • H4 (Spouse of H1B)
Employment Type:
  • Full Time
  • Part Time
  • Permanent
  • Independent - 1099
  • Contract – W2
  • C2H Independent
  • C2H W2
  • Contract – Corp 2 Corp
  • Contract to Hire – Corp 2 Corp

Description:

We are looking for a Data Analyst/Engineer (Databricks) to join our team.
Our client is a leading SaaS provider that specializes in developing solutions to help users visualize, navigate, collaborate, and manage digital representations of physical assets. Their platform empowers industries to work efficiently by providing a comprehensive digital workspace for asset management.

REQUIREMENTS:

  • Experience: Minimum of 3 years as a data analyst, engineer, or in a relevant field.
  • Python Proficiency: Advanced experience in Python, especially in delivering production-grade data pipelines.
  • Cloud & Containers: Experience in developing and/or deploying services on cloud platforms, and familiarity with container technologies (e.g., Docker).
  • Cloud Platforms: Knowledge of Azure.
  • Data Platforms: Experience with Apache Spark, Databricks, Snowflake, or other relevant platforms.
  • Databricks: Hands-on experience.
  • Databases: Knowledge of document, graph, and relational databases, along with proficiency in SQL.
  • Integration Testing: Ability to write and manage integration tests (see the illustrative sketch after this list).
  • Version Control: Experience with Gitlab or equivalent tools.
  • English Proficiency: B2 level or higher.
  • Strong collaboration and communication skills; experience working in an international team environment.
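
For illustration only, the integration-testing requirement above could be met with a pytest-based test that exercises a small PySpark step end to end. This is a minimal sketch under assumptions: the `deduplicate_events` function, the fixture, and the column names are hypothetical and not part of the client's codebase.

```python
# Minimal integration-test sketch (hypothetical function and column names).
# Uses a local SparkSession so the test can run without a Databricks cluster.
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    # Local, single-threaded Spark session for the test run.
    session = (
        SparkSession.builder.master("local[1]")
        .appName("integration-test")
        .getOrCreate()
    )
    yield session
    session.stop()


def deduplicate_events(df):
    # Stand-in for a real pipeline step: drop duplicate event rows by id.
    return df.dropDuplicates(["event_id"])


def test_deduplicate_events_removes_duplicates(spark):
    input_df = spark.createDataFrame(
        [(1, "created"), (1, "created"), (2, "deleted")],
        ["event_id", "event_type"],
    )
    result = deduplicate_events(input_df)
    # Two distinct event ids remain after deduplication.
    assert result.count() == 2
```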

Responsibilities:

  • Design, build, and maintain production-grade data pipelines using Python.
  • Collaborate with an international English-speaking team to develop scalable data solutions.
  • Integrate and manage APIs for data retrieval and processing.
  • Write integration tests to ensure the quality and reliability of data services.
  • Work with Gitlab or similar version control systems to manage code and collaborate with team members.
  • Utilize Databricks for data processing and management (an illustrative pipeline sketch follows this list).
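
To give a feel for the kind of work described above, here is a minimal sketch of a PySpark pipeline step of the sort that might run as a Databricks job. The table names, storage paths, and columns are hypothetical and are not taken from the client's actual platform.

```python
# Illustrative PySpark pipeline step (hypothetical tables, paths, and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("asset-events-pipeline").getOrCreate()

# Read raw asset events landed by an upstream ingestion job.
raw = spark.read.format("delta").load("/mnt/raw/asset_events")

# Basic cleaning and enrichment: drop malformed rows, derive a processing date.
cleaned = (
    raw.dropna(subset=["asset_id", "event_time"])
       .withColumn("processing_date", F.to_date(F.col("event_time")))
)

# Write to a curated Delta table partitioned by processing date.
(
    cleaned.write.format("delta")
    .mode("overwrite")
    .partitionBy("processing_date")
    .saveAsTable("curated.asset_events")
)
```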


REQUIREMENT SUMMARY

Experience: Min 3.0 / Max 8.0 year(s)
Industry: Information Technology/IT
Category: IT Software - Other
Specialization: Software Engineering
Education: Graduate
English Proficiency: Proficient
Openings: 1
Location: Kraków, małopolskie, Poland