Lead Data DevOps Engineer
at Epam Systems
Work from home, Cauca, Colombia
| Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Immediate | 31 Jan, 2025 | USD 200 Annual | 01 Nov, 2024 | 1 year(s) or above | Kubernetes, Python, Scripting Languages, Ansible, ICMP, Apache Spark, Azure, Automation Tools, Apache Kafka, PowerShell, Infrastructure as Code, Docker, AWS, Bash, Automation | No | No |
Required Visa Status:
- US Citizen
- Green Card (GC)
- H1B
- OPT
- CPT
- Student Visa
- H4 (Spouse of H1B)
Employment Type:
- Full Time
- Part Time
- Permanent
- Independent – 1099
- Contract – W2
- Contract – Corp 2 Corp
- Contract to Hire (C2H) – W2
- Contract to Hire (C2H) – Independent
- Contract to Hire (C2H) – Corp 2 Corp
Description:
We are currently seeking a highly skilled Lead Data DevOps Engineer to join our remote team.
In this role, you will lead the design, implementation, and maintenance of Data DevOps solutions. Collaborating closely with customers, peers, and vendors, you will resolve complex issues, develop strategic solutions, and uphold technical standards. This position is an exciting opportunity to make a significant impact on the growth and success of our organization.
We accept CVs in English only.
REQUIREMENTS
- Minimum of 5 years of relevant professional experience
- Expertise in leading and managing complex engineering projects, with at least 1 year of leadership experience
- Demonstrated experience in designing and implementing Data DevOps solutions
- Strong understanding of CI/CD pipelines, automation, and infrastructure as code
- Proficiency in cloud platforms such as AWS, Azure, or GCP
- Experience with containerization and orchestration technologies, such as Docker and Kubernetes
- Knowledge of scripting languages and automation tools like Python, PowerShell, or Bash
- Solid understanding of network protocols and mechanisms, such as TCP, UDP, ICMP, DHCP, DNS, and NAT
- Familiarity with database technologies and data management best practices
- Infrastructure as Code experience with tools like Ansible, Terraform, or CloudFormation
- Experience installing and configuring data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, Apache Airflow, or similar tools
- Strong SQL skills
- Professional mastery of the Linux operating system
- Ability to collaborate effectively with cross-functional teams and stakeholders
- Upper-Intermediate English level
Responsibilities:
- Lead the design, implementation, and maintenance of Data DevOps solutions
- Design, deploy, and manage data infrastructure in the cloud, primarily using one of the major cloud platforms such as AWS, Azure, or GCP
- Collaborate with cross-functional teams to establish and implement engineering best practices
- Develop and maintain CI/CD pipelines for data infrastructure and applications
- Automate and streamline data-related processes to ensure scalability and reliability
- Provide technical leadership and mentorship to engineering teams
- Ensure the security, availability, and performance of data platforms
- Install, configure, and maintain data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, Apache Airflow, or similar tools in both on-premises and cloud environments
- Monitor and troubleshoot data systems, proactively identifying and resolving performance, scalability, and reliability issues
- Collaborate with stakeholders to understand and address their data infrastructure needs
REQUIREMENT SUMMARY
Min: 1.0 | Max: 5.0 year(s)
Information Technology/IT
IT Software - Other
Software Engineering
Graduate
Proficient
1
Work from home, Colombia