Senior Data DevOps Engineer
at Epam Systems
Work from home, Cauca, Colombia
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 31 Jan, 2025 | USD 200 Annual | 01 Nov, 2024 | 2 year(s) or above | SQL, Mechanisms, Jenkins, Infrastructure, Docker, Automation Tools, Scripting Languages, Apache Spark, Apache Kafka, Communication Skills, Kubernetes, Bamboo, Python, Ansible, ICMP, Azure, PowerShell, AWS, Bash, TeamCity, English | No | No
Description:
We are looking for a remote Senior Data DevOps Engineer to join our dynamic team.
In this role, you will play a crucial part in defining and implementing the architecture of our projects. Working closely with customers, peers, and vendors, you will resolve complex issues, develop strategic solutions, maintain technical standards, and drive continuous improvement and innovation in data engineering practices.
We accept CVs in English only.
REQUIREMENTS
- 3+ years of relevant professional experience
- Expertise in designing and implementing Data DevOps solutions
- Strong proficiency in cloud platforms: Azure, GCP, or AWS
- Extensive experience with Infrastructure as Code tools like Ansible, Terraform, or CloudFormation
- Ability to set up and manage CI/CD pipelines using popular tools like Jenkins, Bamboo, TeamCity, GitLab CI, or GitHub Actions
- Proficiency in scripting languages and automation tools such as Python, PowerShell, or Bash
- Solid understanding of containerization and orchestration technologies like Docker and Kubernetes
- Experience installing, configuring, and optimizing data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, and Apache Airflow
- In-depth knowledge of network protocols and mechanisms, including TCP, UDP, ICMP, DHCP, DNS, and NAT
- Proficiency with SQL
- Knowledge of Linux operating system
- Excellent collaboration and communication skills
- Fluency in English at a B2+ level
Responsibilities:
- Lead the design, deployment, and management of data infrastructure in the cloud, primarily using major cloud platforms such as AWS, Azure, or GCP
- Develop and maintain robust CI/CD pipelines for data infrastructure and applications
- Automate and streamline data-related processes to ensure scalability, reliability, and efficiency
- Ensure the security, availability, and optimal performance of data platforms
- Install, configure, and maintain data tools such as Apache Spark, Apache Kafka, ELK Stack, Apache NiFi, Apache Airflow, or similar tools in both on-premises and cloud environments
- Monitor and troubleshoot data systems, proactively identifying and resolving performance, scalability, and reliability challenges
- Provide mentorship to junior team members
REQUIREMENT SUMMARY
Min: 2.0 | Max: 3.0 year(s)
Information Technology/IT
IT Software - Other
Software Engineering
Graduate
Proficient
1
Work from home, Colombia