Data Engineer at Haystack
Liverpool, England, United Kingdom
Full Time


Start Date

Immediate

Expiry Date

25 Oct, 25

Salary

Not specified

Posted On

26 Jul, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Accountability, Ownership, Security, dbt, Technical Ability, Microsoft Azure, JSON, Emerging Technologies

Industry

Information Technology/IT

Description

Hybrid requirements: This role has flexible working patterns.
We’re on the hunt for builders. No, we’ve not ventured into construction in our quest to conquer the world; rather, we’re after designers and builders of systems for all things data, helping us conquer the Data World.

We can offer interesting insight into projects spanning a variety of sectors, including telecoms, insurance, finance and mortgages.

  • First and foremost, we seek strong Consultants; so if you are ready to join our dynamic team, where you can truly act as an expert in your field in support of our clients and their challenges in the world of data and technology, read on!

THE TEAM

We’re Intuita – a fast-growing consultancy that’s making waves in both the consultancy and technology space. With our ambitious goals for 2024 and beyond, we are looking for talented individuals to complement the team of experts we already have across our business, people who will become a pivotal part of our journey to not just meet, but continuously exceed, our clients’ expectations!

A BIT ABOUT YOU!

As much as we just love working with great, fun people, there are some obvious required Skills and Experience we are going to be seeking out. For this role we’d be expecting to see:

  • Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server.
  • Proficiency in SQL and Python languages.
  • Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF).
  • Familiarity with building metadata-driven pipelines.
  • Knowledge of Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet (see the sketch after this list).
  • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking.
  • Exposure to Apache Airflow and DBT is a bonus.
  • Familiarity with agile principles and practices.
  • Experience with Azure DevOps pipelines.
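
To give a flavour of what these skills look like day to day, here is a minimal, illustrative PySpark sketch of a Medallion-style bronze-to-silver step on Azure Databricks, reading raw JSON from Azure Storage and writing curated Delta data. The storage account, container names and columns are invented for the example, and authentication to the storage account is assumed to already be configured.

```python
# Illustrative only: storage account, containers and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided for you on Databricks

# Bronze: land raw JSON from Azure Storage (ADLS Gen2) as-is, stamped with load metadata.
bronze_path = "abfss://bronze@examplestorage.dfs.core.windows.net/orders/"
raw = (
    spark.read.json("abfss://landing@examplestorage.dfs.core.windows.net/orders/")
    .withColumn("_ingested_at", F.current_timestamp())
)
raw.write.format("delta").mode("append").save(bronze_path)

# Silver: apply typing, de-duplication and basic quality rules for downstream models.
silver = (
    spark.read.format("delta").load(bronze_path)
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").save(
    "abfss://silver@examplestorage.dfs.core.windows.net/orders/"
)
```
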
Responsibilities

THE ROLE

We are seeking a skilled Data Engineer to join our team, initially on a contract basis, with the potential to discuss a permanent consultant opportunity longer term. This position is contingent on winning new business opportunities currently in our pipeline, but we can move fast and adapt our selection period to our needs. With this in mind, immediate or quick availability is ideal, but it is not the sole factor we’ll consider.
This is a typical Data Engineering role in that, day to day, you will be ingesting large data sets for our clients, which involves designing data pipelines, integrating data and transforming it into models for downstream consumption.

KEY OUTPUTS FOR THE ROLE

  • Develop and maintain data pipelines using Azure Data Factory (ADF), ensuring efficient and reliable data movement and transformation.
  • Data modelling using Kimball, 3NF or Dimensional methodologies.
  • Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server.
  • Design and implement metadata-driven pipelines to automate data processing tasks (illustrated in the sketch after this list).
  • Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions.
  • Ensure data quality, integrity, and security throughout the data pipeline.
  • Troubleshoot and resolve any issues or errors in the data pipelines.
  • Stay updated with the latest Azure technologies and industry best practices to continuously enhance data engineering capabilities.
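
On the metadata-driven point above: in Azure Data Factory this pattern usually takes the shape of a Lookup activity feeding a ForEach loop over a control table, so that onboarding a new source means adding a row rather than building a new pipeline. The sketch below expresses the same idea in PySpark on Databricks rather than in ADF JSON; the control table, its columns and the target tables are hypothetical.

```python
# Illustrative pattern only: the control table, its columns and paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# A control (metadata) table describes each source once; the ingestion code stays generic.
# Example rows: source_name='customers', file_format='csv',
#               source_path='abfss://landing@.../customers/', target_table='bronze.customers'
sources = spark.read.table("control.ingestion_sources").collect()

for src in sources:
    reader = spark.read.format(src.file_format)
    if src.file_format == "csv":
        reader = reader.option("header", "true")
    df = reader.load(src.source_path)
    # One generic write path for every source: new feeds are configuration, not code.
    df.write.format("delta").mode("append").saveAsTable(src.target_table)
```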
