Binance Accelerator Program - DevOps Engineer (Kafka)

at Binance

Remote, Tasmania, Australia

Start Date: Immediate
Expiry Date: 29 Sep, 2024
Salary: Not Specified
Posted On: 29 Jun, 2024
Experience: N/A
Skills: Operating Systems, Communication Skills, Apache Kafka, Distributed Systems, Data Streaming, Messaging, Scripting Languages
Telecommute: No
Sponsor Visa: No

Description:

Binance is the global blockchain company behind the world’s largest digital asset exchange by trading volume and users, serving a greater mission to accelerate cryptocurrency adoption and increase the freedom of money.
Are you looking to be a part of the most influential company in the blockchain industry and contribute to the cryptocurrency revolution that is changing the world?

JOB SUMMARY

We are looking for a DevOps intern to join our dynamic engineering team. The ideal candidate will have a strong foundational knowledge of Apache Kafka, including installation, configuration, and optimization. They will work closely with senior engineers to support our Kafka infrastructure and contribute to the development of high-performance applications.

REQUIREMENTS:

  • Basic understanding of Apache Kafka and its components (brokers, topics, producers, consumers); a short producer/consumer sketch follows this list.
  • Familiarity with distributed systems and real-time data processing concepts.
  • Basic knowledge of Linux operating systems and scripting languages.
  • Understanding of network protocols and data serialization formats (e.g., Avro, JSON).
  • Strong analytical and troubleshooting skills.
  • Good communication skills and ability to collaborate effectively with team members.
  • Eagerness to learn and grow in the field of data streaming and messaging systems.

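For orientation only, the following is a minimal sketch of the producer/consumer flow named in the first requirement above. It assumes the third-party kafka-python client, a broker reachable at localhost:9092, and a topic called "events"; none of these details come from the posting itself.

    # Minimal Kafka producer/consumer round trip (assumes kafka-python and a local broker).
    import json

    from kafka import KafkaConsumer, KafkaProducer

    # Producer: serialize each record as JSON and publish it to the "events" topic.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("events", {"user": "alice", "action": "login"})
    producer.flush()  # block until the broker acknowledges the message

    # Consumer: read the topic from the beginning and stop after 5 seconds of inactivity.
    consumer = KafkaConsumer(
        "events",
        bootstrap_servers="localhost:9092",
        group_id="demo-group",
        auto_offset_reset="earliest",
        consumer_timeout_ms=5000,
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    for record in consumer:
        print(record.topic, record.partition, record.offset, record.value)
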
Responsibilities:

  • Assist in the installation, configuration, and maintenance of Kafka clusters.
  • Monitor Kafka performance metrics and troubleshoot issues (a consumer-lag monitoring sketch follows this list).
  • Support the development and maintenance of Kafka-based data pipelines.
  • Work with development teams to integrate Kafka into applications.
  • Help maintain documentation for Kafka configurations and procedures.
  • Participate in debugging and resolving Kafka-related problems.
  • Stay informed about Kafka updates and industry best practices.

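As one concrete illustration of the monitoring responsibility above, the hedged sketch below reports consumer-group lag per partition. It again assumes kafka-python, a broker at localhost:9092, and the placeholder group "demo-group" and topic "events".

    # Report per-partition consumer lag (end offset minus committed offset) for one group and topic.
    # Assumes kafka-python and a broker on localhost:9092; the group and topic names are illustrative.
    from kafka import KafkaConsumer, TopicPartition

    consumer = KafkaConsumer(
        bootstrap_servers="localhost:9092",
        group_id="demo-group",
        enable_auto_commit=False,
    )

    topic = "events"
    partitions = [TopicPartition(topic, p) for p in (consumer.partitions_for_topic(topic) or [])]
    end_offsets = consumer.end_offsets(partitions)   # latest offset per partition

    for tp in partitions:
        committed = consumer.committed(tp) or 0      # last offset the group committed, 0 if none
        lag = end_offsets[tp] - committed            # messages the group has not yet processed
        print(f"{tp.topic}[{tp.partition}] lag={lag}")

    consumer.close()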

REQUIREMENT SUMMARY

Experience: Min: N/A, Max: 5.0 year(s)
Industry: Information Technology/IT
Category: IT Software - Other
Role: Software Engineering
Education: Graduate
Proficiency: Proficient
Openings: 1
Location: Remote, Australia