Senior DataOps Engineer

at PBT Group

Cape Town, Western Cape, South Africa

Start Date: Immediate
Expiry Date: 22 Jan, 2025
Salary: Not Specified
Posted On: 23 Oct, 2024
Experience: N/A
Skills: Scala, Apache Spark, Automation, Languages, Code, Processing, Kafka, Orchestration, Storage Solutions, Security, Pipelines, Automation Tools, Python, Infrastructure, Scripting, Java, Version Control, Ansible, Scripting Languages
Telecommute: No
Sponsor Visa: No

Description:

We are looking for a Senior DataOps Engineer to join our team and lead the development of DataOps practices within our organisation. As this is a relatively new area for us, we need someone with a strong DevOps background combined with software and data experience. You will be responsible for setting the direction for DataOps within the team, building scalable data pipelines, and ensuring efficient data operations in a fully cloud-based AWS environment. The role suits a senior software professional with a deep understanding of DevOps practices and a solid foundation in data.

QUALIFICATIONS & EXPERIENCE

  • Proven experience as a DevOps Engineer, with significant exposure to data-focused environments.
  • Strong understanding of data concepts, including data pipelines, ETL/ELT processes, and data storage solutions.
  • Expertise in building, deploying, and managing data pipelines in cloud environments, especially AWS.
  • Familiarity with AWS services such as S3, Redshift, Lambda, RDS, and other relevant cloud-based data tools.
  • Experience with CI/CD pipelines, infrastructure-as-code, and automation tools (e.g., Jenkins, Terraform, Ansible).
  • Strong knowledge of version control systems (e.g., Git) and containerisation technologies (e.g., Docker, Kubernetes).
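To make the pipeline expectations above concrete, here is a minimal sketch of an ETL step in Python. All names are illustrative; a real pipeline would read from and write to actual systems (e.g. S3 or Redshift) rather than in-memory data:

```python
# Minimal ETL sketch: extract raw records, transform them, load into a store.
# Source and sink are in-memory stand-ins for real storage (hypothetical).

def extract(source):
    """Extract raw records from a source (here, an in-memory list)."""
    return list(source)

def transform(records):
    """Normalise the name field and drop records without a primary key."""
    cleaned = []
    for rec in records:
        if rec.get("id") is None:
            continue  # skip incomplete records
        cleaned.append({"id": rec["id"], "name": rec.get("name", "").strip().lower()})
    return cleaned

def load(records, store):
    """Load transformed records into a keyed store, overwriting duplicates."""
    for rec in records:
        store[rec["id"]] = rec
    return store

raw = [{"id": 1, "name": "  Alice "}, {"id": None, "name": "ghost"}, {"id": 2, "name": "BOB"}]
warehouse = load(transform(extract(raw)), {})
```

In production the same extract/transform/load stages would typically be orchestrated and versioned rather than called inline.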

PREFERRED SKILLS

  • Experience with DataOps principles and practices, including data versioning, pipeline automation, and data observability.
  • Strong programming skills in languages such as Python, Java, or Scala, with experience in building data processing solutions.
  • Familiarity with data processing frameworks like Apache Spark, Kafka, or similar.
  • Exposure to security best practices for data storage and processing in cloud environments.
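As an illustration of the data-versioning practice mentioned above, here is a hedged sketch of content-addressed versioning in Python: a dataset's version id is a hash of its canonical serialisation, so identical data always maps to the same id. The function name and truncation length are assumptions, not a specific tool's API:

```python
# Content-addressed data versioning sketch (illustrative, not a real tool):
# identical datasets hash to the same version id; any change produces a new id.
import hashlib
import json

def dataset_version(records):
    """Return a deterministic version id for a list of records."""
    canonical = json.dumps(records, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:12]

v1 = dataset_version([{"id": 1, "value": 10}])
v2 = dataset_version([{"id": 1, "value": 10}])
v3 = dataset_version([{"id": 1, "value": 11}])
```

Sorting keys before hashing makes the id independent of field order, which is the property that makes such ids usable for pipeline reproducibility checks.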

TECHNICAL SKILLS

  • DevOps Tools: Strong understanding of CI/CD tools, infrastructure automation, and version control.
  • Cloud Expertise: Deep familiarity with AWS cloud infrastructure, including automation and orchestration of services.
  • Data Management: Knowledge of modern data architectures, pipelines, ETL/ELT processes, and relevant technologies.
  • Scripting & Automation: Proficiency in automating tasks using scripting languages (Python, Bash) and DevOps tools.
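The scripting and automation skills above often boil down to small, reusable utilities. A minimal sketch of one such utility, a bounded-retry wrapper for flaky tasks (names and retry count are hypothetical):

```python
# Bounded-retry sketch for automating flaky steps (network calls,
# transient infrastructure errors). Illustrative, not a specific library.
import functools

def retry(times):
    """Retry a function up to `times` attempts before re-raising the last error."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            last_exc = None
            for _ in range(times):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    last_exc = exc
            raise last_exc
        return wrapper
    return decorator

attempts = []

@retry(times=3)
def flaky_task():
    """Simulated task that fails twice before succeeding."""
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = flaky_task()
```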

Responsibilities:

  • Lead DataOps Strategy:
      • Define and implement DataOps practices, guiding the team towards a more automated and efficient data pipeline management system.
      • Collaborate with data engineers, data scientists, and other stakeholders to establish best practices for data workflows and infrastructure.
  • Build & Maintain Data Pipelines:
      • Design, deploy, and manage scalable and reliable data pipelines to enable efficient data flow and processing across cloud environments.
      • Automate and optimise ETL/ELT processes, ensuring seamless data integration from various sources.
  • DevOps for Data:
      • Apply DevOps principles to data workflows, focusing on continuous integration, continuous delivery (CI/CD), and infrastructure-as-code (IaC) for data operations.
      • Implement version control and automation for data-related processes, reducing errors and improving data quality.
  • Cloud Infrastructure Management (AWS):
      • Manage and optimise AWS cloud infrastructure to support data workflows, including S3, Redshift, Lambda, and RDS.
      • Monitor, maintain, and scale AWS resources for efficient data storage, processing, and analysis.
  • Data Security & Compliance:
      • Ensure data security and compliance with relevant industry standards and regulations, implementing appropriate security protocols and monitoring.
      • Collaborate with security teams to ensure secure handling of sensitive data and manage access controls effectively.
  • Monitoring & Optimisation:
      • Set up monitoring, logging, and alerting mechanisms for data pipelines to ensure high availability and performance.
      • Identify bottlenecks and inefficiencies in data processes, proposing and implementing optimisations.
  • Collaboration & Mentorship:
      • Provide technical leadership and mentorship to junior team members, helping them to grow in their roles and expand their knowledge of DataOps practices.
      • Work closely with data engineering, software development, and IT teams to drive cross-functional initiatives.
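The monitoring and alerting responsibility above can be sketched as a simple threshold check over per-stage pipeline durations. Stage names, durations, and limits here are illustrative assumptions; a real setup would feed metrics from a monitoring system such as CloudWatch:

```python
# Pipeline monitoring sketch: flag stages whose observed duration breaches
# a configured threshold. All stage names and limits are hypothetical.

def check_stage_durations(durations, thresholds):
    """Return alert messages for stages that exceeded their time limit."""
    alerts = []
    for stage, seconds in durations.items():
        limit = thresholds.get(stage)
        if limit is not None and seconds > limit:
            alerts.append(f"ALERT: stage '{stage}' took {seconds}s (limit {limit}s)")
    return alerts

observed = {"extract": 12.0, "transform": 95.0, "load": 30.0}
limits = {"extract": 60.0, "transform": 60.0, "load": 60.0}
alerts = check_stage_durations(observed, limits)
```

Separating the check from the metric source keeps the alerting logic testable independently of the monitoring backend.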


REQUIREMENT SUMMARY

Experience: Min N/A, Max 5.0 year(s)
Industry: Information Technology/IT
Category: IT Software - DBA / Data Warehousing
Role: Software Engineering
Education: Graduate
Level: Proficient
Openings: 1
Location: Cape Town, Western Cape, South Africa