Binance Accelerator Program - DevOps Engineer (Kafka)
at Binance
Remote, Tasmania, Australia
| Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
|---|---|---|---|---|---|---|---|
| Immediate | 29 Sep, 2024 | Not Specified | 30 Jun, 2024 | N/A | Operating Systems, Scripting Languages, Messaging, Data Streaming, Apache Kafka, Distributed Systems, Communication Skills | No | No |
Required Visa Status:
- US Citizen
- Green Card (GC)
- H1B
- OPT
- CPT
- Student Visa
- H4 (Spouse of H1B)
Employment Type:
- Full Time
- Part Time
- Permanent
- Independent - 1099
- Contract - W2
- Contract - Corp 2 Corp
- Contract to Hire (C2H) - W2
- Contract to Hire (C2H) - Independent
- Contract to Hire (C2H) - Corp 2 Corp
Description:
Binance is the global blockchain company behind the world’s largest digital asset exchange by trading volume and users, serving a greater mission to accelerate cryptocurrency adoption and increase the freedom of money.
Are you looking to be part of the most influential company in the blockchain industry and contribute to the cryptocurrency revolution that is changing the world?
JOB SUMMARY
We are looking for a DevOps intern to join our dynamic engineering team. The ideal candidate will have strong foundational knowledge of Apache Kafka, including installation, configuration, and optimization. They will work closely with senior engineers to support our Kafka infrastructure and contribute to the development of high-performance applications.
REQUIREMENTS:
- Basic understanding of Apache Kafka and its components (brokers, topics, producers, consumers).
- Familiarity with distributed systems and real-time data processing concepts.
- Basic knowledge of Linux operating systems and scripting languages.
- Understanding of network protocols and data serialization formats (e.g., Avro, JSON).
- Strong analytical and troubleshooting skills.
- Good communication skills and ability to collaborate effectively with team members.
- Eagerness to learn and grow in the field of data streaming and messaging systems.
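The serialization point above can be illustrated with a short sketch. Kafka transports opaque bytes, so producers typically encode payloads (here as JSON, using only the Python standard library; the event and its fields are hypothetical, not from this posting) before publishing, and consumers decode them on the other side:

```python
import json

# Hypothetical order event destined for a Kafka topic.
event = {"order_id": 1001, "symbol": "BTCUSDT", "qty": 0.5}

# A producer would serialize the dict to UTF-8 JSON bytes before sending,
# because Kafka brokers store and forward raw bytes.
payload = json.dumps(event).encode("utf-8")

# A consumer receiving those bytes deserializes them back into a dict.
decoded = json.loads(payload.decode("utf-8"))
assert decoded == event
```

Avro works the same way in principle but adds a schema, which catches incompatible payload changes that plain JSON would let through.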
Responsibilities:
- Assist in the installation, configuration, and maintenance of Kafka clusters.
- Monitor Kafka performance metrics and troubleshoot issues.
- Support the development and maintenance of Kafka-based data pipelines.
- Work with development teams to integrate Kafka into applications.
- Help maintain documentation for Kafka configurations and procedures.
- Participate in debugging and resolving Kafka-related problems.
- Stay informed about Kafka updates and industry best practices.
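As a sketch of the kind of configuration the maintenance work above involves, a minimal broker `server.properties` fragment might look like the following (the values are illustrative defaults, not Binance's actual settings):

```properties
# Unique id of this broker within the cluster
broker.id=0
# Where the broker stores its commit log segments
log.dirs=/var/lib/kafka/data
# Default partition count for auto-created topics
num.partitions=3
# Retain log segments for 7 days before deletion
log.retention.hours=168
# Listener the broker binds to for client connections
listeners=PLAINTEXT://0.0.0.0:9092
```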
REQUIREMENT SUMMARY
Min: N/A | Max: 5.0 year(s)
Information Technology/IT
IT Software - Other
Software Engineering
Graduate
Proficient
1
Remote, Australia