Data Engineering - Snowflake at DXC Technology
Erskine, United Kingdom
Full Time


Start Date

Immediate

Expiry Date

06 Dec, 25

Salary

Not specified

Posted On

07 Sep, 25

Experience

1 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Scripting, Technology, dbt, Processing, Automation, SC Clearance, Data Visualization, Exploratory Data Analysis, Code, Infrastructure, Snowflake

Industry

Information Technology/IT

Description

REQUIRED SKILLS & EXPERIENCE:

  • Proven experience with:
  • Building and maintaining robust data pipelines using Snowpipe and COPY INTO.
  • Developing modular, testable data transformations with dbt.
  • Orchestrating workflows and managing dependencies using Apache Airflow.
  • Snowpark for advanced data applications and processing.
  • Terraform for managing infrastructure as code.
  • CI/CD pipelines to streamline deployments.
  • SnowSQL for scripting and automation against Snowflake.
  • Writing and optimizing advanced ANSI SQL queries, including window functions.
  • Query and system performance optimization.
  • SnowPro Core Certification, or willingness to achieve it.
  • Excellent problem-solving skills and a passion for data and AI.
  • Bachelor’s degree in a relevant field, or an equivalent combination of education and experience.
  • Typically 4+ years of relevant industry experience, with a minimum of 1 year in a similar role.
  • Proficiency in data cleansing, exploratory data analysis, and data visualization.
  • A continuous learner who stays abreast of industry knowledge and technology.
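As a flavour of the advanced ANSI SQL this role calls for, the sketch below uses a window function to pick the latest record per customer. The table and column names are purely illustrative and not part of this posting.

```sql
-- Hypothetical example: latest order per customer via a window function.
-- Table and column names are illustrative only.
SELECT customer_id, order_id, order_ts, amount
FROM (
    SELECT customer_id, order_id, order_ts, amount,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id
               ORDER BY order_ts DESC
           ) AS rn
    FROM orders
)
WHERE rn = 1;
```

In Snowflake specifically, the subquery can be avoided with a QUALIFY clause (`QUALIFY rn = 1`) directly on the windowed query.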
Responsibilities

ABOUT THE ROLE:

DXC is a forward-thinking organisation at the forefront of data innovation. We harness the power of modern data platforms to deliver actionable insights and drive strategic decisions. We’re looking for a Data Engineer with deep expertise in Snowflake to join our growing team.

KEY RESPONSIBILITIES:

  • Build and maintain robust data pipelines using Snowpipe and COPY INTO.
  • Develop modular, testable data transformations with dbt.
  • Orchestrate workflows and manage dependencies using Apache Airflow.
  • Leverage Snowpark for advanced data applications and processing.
  • Use Terraform to manage infrastructure as code.
  • Implement CI/CD pipelines to streamline deployments.
  • Interact with Snowflake using SnowSQL for scripting and automation.
  • Write and optimize advanced ANSI SQL queries, including window functions.
  • Continuously improve performance through query and system optimization.
  • Collaborate with cross-functional teams to deliver AI-driven solutions.
  • Ensure best practices in data engineering and contribute to architectural decisions.
  • Support senior team members in identifying and addressing data science opportunities.
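To illustrate the Snowpipe and COPY INTO work described above, here is a minimal sketch of a pipe that continuously loads staged JSON files into a table. The stage, table, and pipe names are hypothetical, not taken from this posting.

```sql
-- Hypothetical sketch: a Snowpipe wrapping a COPY INTO load.
-- Stage, table, and pipe names are illustrative only.
CREATE OR REPLACE PIPE raw.events_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.events
  FROM @raw.events_stage
  FILE_FORMAT = (TYPE = 'JSON')
  ON_ERROR = 'CONTINUE';
```

With AUTO_INGEST enabled, cloud storage event notifications trigger the load as new files land in the stage; the same COPY INTO statement can also be run ad hoc for batch backfills.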