Java Full Stack Engineer (remote) at NTT DATA
Plano, Texas, United States
Full Time


Start Date

Immediate

Expiry Date

21 Jan, 26

Salary

Not specified

Posted On

23 Oct, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Java, Spring Boot, Spring Cloud, Spring Batch, Microservices, Angular, HTML5, CSS3, TypeScript, Kafka, REST APIs, COBOL, DB2, SQL Optimization, Google Cloud Platform, Git, Jenkins

Industry

IT Services and IT Consulting

Description
Responsibilities

Ensure data integrity for all logistics data. Develop and implement event-driven data pipelines using Kafka for real-time data processing and communication between microservices. Work with GCP services, potentially including Google Kubernetes Engine (GKE) for orchestration, Compute Engine for virtual machines, and Cloud Storage for data. Develop middleware and APIs to bridge the gap between newly developed Java applications and legacy COBOL-based systems. Participate in regular code reviews with other developers to maintain quality standards. Participate in daily stand-ups, sprint planning, and retrospective meetings to track progress and plan future tasks within an Agile/Scrum framework. Collaborate with business analysts, UI/UX designers, and QA testers to define and deliver new features for logistics applications. Guide and mentor junior developers, ensuring that best practices are followed and knowledge is shared across the team.

Requirements

Experience deploying and managing applications on the Google Cloud Platform (GCP) and using its related services, such as GKE and Pub/Sub.

Technical Stack

Backend: Java, Spring Boot, Spring Cloud, Spring Batch, Microservices
Frontend: Angular (v10+), HTML5, CSS3, TypeScript
Integration: Kafka, REST APIs, COBOL interfacing
Database: DB2, SQL optimization
Cloud: Google Cloud Platform (GKE, Cloud Run, Pub/Sub, Cloud SQL)
Tools: Git, Jenkins, Docker, Kubernetes, CI/CD pipelines

Domain Expertise

Experience in logistics/supply chain systems: shipment, order, and inventory management. GCP or Java certifications are preferred.
Responsibilities
Ensure data integrity for all logistics data and develop event-driven data pipelines using Kafka for real-time processing. Collaborate with various teams to define and deliver new features for logistics applications while mentoring junior developers.