Senior Developer / Development Lead at nuArch LLC
North Carolina, USA
Full Time


Start Date

Immediate

Expiry Date

11 Sep 2025

Salary

Not specified

Posted On

12 Jun 2025

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Microservices, Docker, Trading Systems, Computer Science, Distributed Systems, Architecture, Apache Kafka, Kubernetes, Stream Processing, Data Analytics, TIBCO EMS

Industry

Information Technology/IT

Description

We are seeking motivated and proactive Software Engineers with expertise in Java, Kafka and real-time messaging, Apache Flink, and MQ technologies to design, develop, and maintain a foundational data hub. This framework will enable seamless, event-driven communication and data streaming to support near real-time risk reporting, analytics, and business reporting.

Qualifications:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
  • Strong Java development experience with expertise in real-time messaging and Kafka.
  • Experience integrating Kafka with trading systems and managing high-volume, low-latency data streams.
  • Proficiency in Apache Flink for stream processing and real-time data analytics.
  • Familiarity with event-driven architecture, distributed systems, and fault tolerance principles.
  • Proficiency with Apache messaging technologies (e.g., Apache ActiveMQ or Apache Kafka) and MQ systems (e.g., IBM MQ, TIBCO EMS).
  • Experience with Docker, Kubernetes, and microservices architecture is a plus.
  • Strong understanding of message queuing, reliability, and fault-tolerant systems.
Responsibilities

Role Includes:

  • Technical design, analysis, and support of the foundational data hub across various systems.
  • Development and customization of integration tools and solutions using Kafka, Flink, MQ Series, or other event-driven messaging systems.
  • Use of integration products to customize or generate solutions that facilitate seamless communication between systems, ensuring high reliability and performance.

Responsibilities:

  • Develop and deploy real-time messaging applications using Kafka.
  • Design and implement Kafka producers and consumers for high-throughput, low-latency data processing in a trading environment (see the producer sketch after this list).
  • Integrate Kafka with various trading platforms and financial systems.
  • Troubleshoot Kafka-related issues and optimize performance for high-frequency trading scenarios.
  • Leverage Apache Flink for real-time stream processing, including event-driven data transformations and aggregation (see the Flink sketch after this list).
  • Collaborate with DevOps and SecOps for Kafka and Flink cluster deployment, monitoring, and maintenance.
  • Stay current with best practices for real-time messaging, Kafka, Apache Flink, and MQ technologies.
  • Design, develop, and integrate Apache messaging frameworks to ensure high-performance and reliable messaging in critical systems.
  • Work with MQ systems for message queuing, ensuring that data is reliably processed and communicated between distributed systems in real time (see the JMS sketch after this list).
  • Assist in the migration and integration of Apache Kafka, Flink, MQ, and other messaging solutions as needed.
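
For illustration only, a minimal sketch of the kind of Kafka producer this role involves, written in Java against the standard kafka-clients API. The broker address, topic name, and trade payload below are assumptions for the example, not details taken from this posting.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class TradeEventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            // Latency- and reliability-oriented settings: no artificial batching delay,
            // acknowledgements from all in-sync replicas, idempotent writes.
            props.put("linger.ms", "0");
            props.put("acks", "all");
            props.put("enable.idempotence", "true");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic and payload; keyed by symbol so one symbol stays on one partition.
                ProducerRecord<String, String> record = new ProducerRecord<>(
                        "trades", "AAPL", "{\"symbol\":\"AAPL\",\"qty\":100,\"px\":187.25}");
                producer.send(record, (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace();
                    } else {
                        System.out.printf("Sent to %s-%d@%d%n",
                                metadata.topic(), metadata.partition(), metadata.offset());
                    }
                });
                producer.flush();
            }
        }
    }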
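
In the same spirit, a small Flink DataStream sketch of a keyed aggregation (a running total of traded quantity per symbol). A production job would read from a Kafka source rather than the fixed collection used here and would typically apply time windows; the symbols and quantities are made up for the example.

    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class TradeVolumeAggregation {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Stand-in for an unbounded stream of (symbol, quantity) trade events.
            DataStream<Tuple2<String, Integer>> trades = env.fromElements(
                    Tuple2.of("AAPL", 100), Tuple2.of("MSFT", 50), Tuple2.of("AAPL", 200));

            trades
                    .keyBy(value -> value.f0) // partition the stream by symbol
                    .sum(1)                   // running total of traded quantity per symbol
                    .print();

            env.execute("trade-volume-aggregation");
        }
    }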
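
Finally, a minimal JMS point-to-point sketch, shown against a local ActiveMQ broker purely for illustration; IBM MQ or TIBCO EMS would supply their own ConnectionFactory instead. The broker URL, queue name, and message body are assumptions.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.DeliveryMode;
    import javax.jms.MessageConsumer;
    import javax.jms.MessageProducer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import org.apache.activemq.ActiveMQConnectionFactory;

    public class OrderQueueDemo {
        public static void main(String[] args) throws Exception {
            ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
            Connection connection = factory.createConnection();
            try {
                connection.start();
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                Queue queue = session.createQueue("ORDERS.IN"); // hypothetical queue name

                // Persistent delivery so the message survives a broker restart.
                MessageProducer producer = session.createProducer(queue);
                producer.setDeliveryMode(DeliveryMode.PERSISTENT);
                producer.send(session.createTextMessage("{\"orderId\":42,\"side\":\"BUY\"}"));

                // Blocking receive with a timeout; event-driven consumers would register a MessageListener.
                MessageConsumer consumer = session.createConsumer(queue);
                TextMessage received = (TextMessage) consumer.receive(1000);
                System.out.println("Received: " + (received == null ? "none" : received.getText()));
            } finally {
                connection.close();
            }
        }
    }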
