DataOps Engineer at RedCloud
London EC2Y, United Kingdom - Full Time


Start Date

Immediate

Expiry Date

02 Dec, 25

Salary

0.0

Posted On

02 Sep, 25

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Airflow, Analytical Skills, Computer Science, Relational Databases, SQL, Soft Skills, Azure, Google Cloud, Data Engineering, Information Technology, Snowflake, AWS, GitHub, Data Warehousing, Perspectives, Python, Data Services, Jenkins, Data Quality

Industry

Information Technology/IT

Description

ABOUT REDCLOUD

We are revolutionizing B2B commerce by delivering scalable, innovative solutions that empower businesses to grow and thrive in a competitive landscape. Our AI-powered platform streamlines logistics, payments, and supply chain operations, making it easier for companies to connect, transact, and succeed. With a diverse and dynamic team, we’re committed to driving impact, fostering collaboration, and shaping the future of global trade.

REQUIREMENTS:

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field. Master’s degree is a plus.
  • 3+ years of experience in DataOps, Data Engineering, or a related role.
  • Proven experience with data pipeline tools (Airflow); a minimal pipeline sketch follows this list.
  • Hands-on experience with cloud platforms (AWS, Azure, Google Cloud) and data services (Snowflake).
  • Proficiency in SQL and experience with relational databases.
  • Strong programming skills in Python.
  • Experience with IaC and CI/CD tools (Terraform, Jenkins, GitHub, Concourse CI).
  • Knowledge of data warehousing and ETL/ELT processes.
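
To give a flavour of the pipeline tooling named above, here is a minimal sketch of an Airflow DAG with a daily extract-and-load flow. The DAG id, task names, and the extract/load callables are illustrative placeholders rather than RedCloud's actual pipelines, and the schedule parameter assumes Airflow 2.4 or later.

# Minimal illustrative Airflow DAG: daily extract -> load into the warehouse.
# All names here are placeholders; swap in real sources, targets, and schedules.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Pull raw records from a source system (placeholder body).
    ...


def load_to_warehouse(**context):
    # Load transformed records into the warehouse, e.g. Snowflake (placeholder body).
    ...


with DAG(
    dag_id="orders_daily",            # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                # 'schedule_interval' on Airflow < 2.4
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load                   # simple linear dependency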

SOFT SKILLS:

  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.
  • Ability to work independently and as part of a team.
  • Attention to detail and a commitment to data quality.

Responsibilities

THE ROLE:

The DataOps Engineer is responsible for developing, implementing, and maintaining the data platform that serves the company’s data lake: data acquisition, transformation pipelines, presentation, data access, and supporting infrastructure. This role focuses on optimizing data flow and collection for cross-functional teams. The ideal candidate has a deep understanding of data engineering principles and best practices, as well as DevOps and software development principles, and is committed to enhancing data management processes.

KEY RESPONSIBILITIES:

  • Pipeline Development & Management:
    • Design, build, and maintain scalable and reliable data pipelines.
    • Automate data ingestion, transformation, and delivery processes.
    • Ensure data quality, integrity, and reliability throughout the data lifecycle.
  • Infrastructure Management:
    • Manage cloud data infrastructure.
    • Implement and optimize data storage solutions.
    • Monitor and ensure the performance, scalability, and security of data platforms.
    • Set up alerts to detect anomalies (see the monitoring sketch after this list).
  • Collaboration & Support:
    • Collaborate with data scientists, analysts, engineers, and other stakeholders to understand data requirements and deliver optimal solutions.
    • Provide support for data-related issues and troubleshooting.
    • Assist in the development and implementation of data governance and security policies, including access management.
  • Continuous Improvement:
    • Implement CI/CD processes for data workflows.
    • Optimize existing processes and frameworks for better performance and cost-efficiency.
    • Stay up to date with the latest industry trends and technologies to continuously improve data operations.
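
As a concrete illustration of the data-quality and alerting responsibilities above, below is a hedged sketch of a row-count check that raises an alert when a daily load falls short of an expected baseline. The threshold, table name, and ALERT_WEBHOOK_URL environment variable are assumptions for illustration, not part of RedCloud's actual stack.

# Illustrative data-quality check with an anomaly alert.
# Thresholds, table names, and the webhook variable are placeholders.
import os

import requests  # assumes alerts go to a webhook (e.g. Slack); adapt to the real channel


def check_row_count(actual: int, expected_min: int, table: str) -> None:
    # Alert and fail the task if a daily load lands well below the expected volume.
    if actual < expected_min:
        message = (
            f"Data quality alert: {table} loaded {actual} rows, "
            f"expected at least {expected_min}."
        )
        webhook = os.environ.get("ALERT_WEBHOOK_URL")  # hypothetical env var
        if webhook:
            requests.post(webhook, json={"text": message}, timeout=10)
        raise ValueError(message)  # surface the failure in the pipeline run


# Example: a load of 1,200 rows against a 10,000-row baseline would alert and fail.
# check_row_count(actual=1200, expected_min=10000, table="orders_daily")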