Senior DevOps Engineer at BIPROCSI
London W1F 7TY, United Kingdom
Full Time


Start Date

Immediate

Expiry Date

03 Sep, 25

Salary

0.0

Posted On

04 Jun, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Infrastructure, Apache Kafka, Communication Skills, Privacy Regulations, Bash, Git, Jenkins, Python, Cloud Storage, Project Delivery, Languages, Analytical Skills, Complex Systems, Apache Spark

Industry

Information Technology/IT

Description

OVERVIEW

We are seeking a highly experienced DevOps Engineer with a strong background in Google Cloud Platform (GCP) and a proven track record in delivering complex data analytics projects for clients. In this full-time, permanent role, you will be responsible for designing, implementing, and managing the infrastructure and deployment processes that drive successful client engagements. You will work as part of a consultancy team, ensuring that each client engagement benefits from a robust, scalable, and secure cloud environment.

EXPERIENCE & QUALIFICATIONS

  • Proven experience as a DevOps Engineer/Consultant with a history of successful client project delivery.
  • Extensive hands-on experience with GCP services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, Dataproc, and Cloud Composer.
  • Strong programming and scripting skills in languages like Python, Bash, or Go to automate tasks and build necessary tools.
  • Expertise in designing and optimising data pipelines using frameworks like Apache Airflow or equivalent.
  • Demonstrated experience with real-time and batch data processing frameworks, including Apache Kafka, Apache Spark, or Google Cloud Dataflow.
  • Proficiency in CI/CD tools such as Jenkins, GitLab CI/CD, or Cloud Build, along with a strong command of version control systems like Git.
  • Solid understanding of data privacy regulations and experience implementing robust security measures.
  • Familiarity with infrastructure as code tools such as Terraform or Deployment Manager.
  • Excellent problem-solving and analytical skills, with the ability to architect and troubleshoot complex systems across diverse client projects.
  • Strong communication skills, enabling effective collaboration with both technical and non-technical client stakeholders.
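As a rough illustration of the scripting and automation skills listed above, here is a minimal Python sketch of an idempotent automation step with retries, the kind of small tooling this role often builds. All names here are hypothetical, not part of any client setup:

```python
import time


def run_with_retries(step, attempts=3, delay=0.1):
    """Run an automation step (any zero-argument callable), retrying on failure.

    Hypothetical helper for wrapping flaky operations such as a deployment
    command or an API call; in practice you would catch specific exceptions.
    """
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as err:  # broad catch for illustration only
            last_error = err
            time.sleep(delay * attempt)  # simple linear backoff
    raise RuntimeError(f"step failed after {attempts} attempts") from last_error
```

In a consultancy context the `step` callable might wrap a `subprocess.run` invocation of `gcloud` or Terraform, keeping the retry policy in one place.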
RESPONSIBILITIES
  • Design and implement scalable, reliable GCP infrastructures tailored to each client’s unique project requirements, ensuring high performance, availability, and security.
  • Work closely with client stakeholders, full-stack developers, data engineers, and data scientists to define and execute efficient data ingestion, processing, and storage solutions that meet project deliverables.
  • Implement and automate client-specific deployment processes using CI/CD pipelines and configuration management tools, enabling rapid and reliable software releases in a consultancy environment.
  • Develop processes around release management, testing, and automation to ensure successful project delivery, adhering to client timelines and quality standards.
  • Implement and manage real-time and batch data processing frameworks (e.g., Apache Kafka, Apache Spark, Google Cloud Dataproc) in line with project needs.
  • Build and maintain robust monitoring, logging, and alerting systems for client projects, ensuring system health and performance are continuously optimised and cost-efficient.
  • Ensure each client’s project complies with data privacy regulations by implementing appropriate access controls and data encryption measures.
  • Troubleshoot and resolve complex technical challenges related to infrastructure, data pipelines, and overall application performance during client engagements.
  • Stay up to date on industry trends and best practices in DevOps, data engineering, and cloud technologies, with a particular focus on GCP, to provide cutting-edge solutions to our clients.
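To hint at the kind of monitoring and alerting logic the responsibilities above involve, here is a minimal, hypothetical Python sketch of threshold-based alert evaluation. The metric name, threshold, and breach count are illustrative assumptions, not a real client configuration:

```python
from dataclasses import dataclass


@dataclass
class AlertRule:
    metric: str          # metric name, e.g. "pipeline_lag_seconds" (hypothetical)
    threshold: float     # fire when a sample exceeds this value
    min_breaches: int    # consecutive breaches required before alerting


def evaluate(rule: AlertRule, samples: list) -> bool:
    """Return True if the last `min_breaches` samples all exceed the threshold."""
    if len(samples) < rule.min_breaches:
        return False
    recent = samples[-rule.min_breaches:]
    return all(value > rule.threshold for value in recent)
```

Requiring several consecutive breaches, as sketched here, is one common way to avoid paging on a single transient spike; production systems typically delegate this to Cloud Monitoring or a similar managed alerting service.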