Data & Analytics Engineer

at Cubane Solutions AB

Stockholm, Stockholms län, Sweden

Start Date: Immediate
Expiry Date: 20 Nov, 2024
Salary: Not Specified
Posted On: 22 Aug, 2024
Experience: N/A
Skills: Database Systems, Apache Spark, Data Warehousing, Containerization, Azure, dbt, Docker, Programming Languages, Pandas, SQL, Kafka, AWS, YAML
Telecommute: No
Sponsor Visa: No

Description:

We are looking for an experienced Data & Analytics Engineer to participate in developing and maintaining the data products and assets required to deliver business value to diverse usage archetypes. The consultant should have prior experience with data-heavy digital solutions and the logistics industry, and should be able to take part in discussions and workshops covering a wide range of aspects of how PostNord operates as a company and how proposed changes fit within the existing system landscape. Our new data platform is built on the Azure/Databricks tech stack, so prior experience with Azure/Databricks and with modern data platforms in general is important. Previous experience in the logistics industry, as well as with first- and last-mile delivery planning and route optimization, is a plus. Where relevant, PostNord has decided that Microsoft’s Power BI will be used as the main platform for creating, sharing and presenting data. There may also be scenarios involving full-stack applications and/or data science workflows.

REQUIREMENTS

  • Proficiency in data modelling techniques (dimensional model, normalisation).
  • Proficiency in database systems, data warehousing and distributed computing concepts.
  • Proficiency in programming languages (e.g. Python, SQL) and data processing frameworks/libraries (Apache Spark, Pandas).
  • Proficiency with data pipeline orchestration tools (Azure Data Factory, Databricks, Azure Logic Apps, Apache Airflow, dbt).
  • Proficiency in working with code repositories (Git) and CI/CD pipelines.
  • Proficiency in integrations, serving and consuming REST APIs and Kafka.
  • Proficiency in containerization (Docker, Kubernetes/OpenShift, Azure Functions).
  • Experience deploying infrastructure resources with YAML and Terraform.
  • Familiarity with cloud platforms (Azure, GCP, or AWS).

Responsibilities:

  • Develop and implement logical data models.
  • Optimize data schema for read and write performance and storage.
  • Design, develop and maintain data pipelines for data ingestion and transformation.
  • Ensure data quality and consistency in developed data products.
  • Set up and manage infrastructure resources for building data pipelines, including databases, cloud storage, and data pipeline runtime environments.
  • Implement data governance and regulatory compliance policies.


REQUIREMENT SUMMARY

Min: N/A, Max: 5.0 year(s)

Information Technology/IT

IT Software - DBA / Datawarehousing

Software Engineering

Graduate

Proficient

1

Stockholm, Sweden