ML Engineer at Sarvam AI
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

30 Mar, 26

Salary

0.0

Posted On

30 Dec, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, Async Programming, FastAPI, gRPC, PostgreSQL, Redis, LLM API Integration, Docker, Kubernetes, GCP, AWS, Azure

Industry

Software Development

Description
About Sarvam.ai
Sarvam.ai is at the forefront of India’s AI revolution, dedicated to building transformative technologies that empower users and redefine digital interactions. Our mission is to develop intelligent systems that integrate seamlessly into everyday workflows, enhancing productivity and user experience.

About the Role
We are looking for a Senior Software Engineer with 3-5 years of experience to join our team building an enterprise-grade AI orchestration platform. You will work on our agentic execution layer, which integrates large language models (LLMs), third-party tools, and task graph execution to enable complex AI-driven workflows.

What You'll Work On
- Design and implement distributed microservices for AI agent orchestration
- Build and optimize task graph execution engines for LLM-powered workflows
- Develop integrations with multiple LLM providers and tool ecosystems
- Create and maintain gRPC/REST APIs for real-time AI interactions
- Implement document processing pipelines (ingestion, chunking, vector embeddings)
- Work on knowledge base systems with vector search capabilities
- Build code interpreter sandboxes for safe AI code execution
- Contribute to MCP (Model Context Protocol) server implementations

Required Skills

Must Have
- Python: 3+ years of production experience, async programming (asyncio), type hints, Pydantic
- Backend: FastAPI or similar frameworks, gRPC and Protobuf, microservices patterns
- Databases: PostgreSQL (SQLAlchemy, migrations), Redis (caching, pub/sub, distributed locks)
- AI/ML: LLM API integration (OpenAI, Google AI, Anthropic), understanding of embeddings and RAG
- Infrastructure: Docker, Kubernetes basics, observability (logging, tracing)
- Cloud: GCP, AWS, or Azure experience

Nice to Have
- Vector databases (Milvus, Pinecone)
- Workflow orchestration (Temporal, Celery)
- Document processing (PDF parsing, OCR)
- Bazel build system
- MCP (Model Context Protocol)
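For illustration only, here is a minimal sketch of the kind of service this stack implies: an async FastAPI endpoint with a Pydantic request model that forwards a prompt to an LLM provider through the OpenAI Python SDK. The route path, default model name, and error handling are assumptions for the example, not details taken from the posting.

```python
# Illustrative sketch only; route, model name, and error handling are assumptions.
import os

from fastapi import FastAPI, HTTPException
from openai import AsyncOpenAI
from pydantic import BaseModel

app = FastAPI()
# The async client picks up the API key from the environment.
client = AsyncOpenAI(api_key=os.environ.get("OPENAI_API_KEY"))


class ChatRequest(BaseModel):
    prompt: str
    model: str = "gpt-4o-mini"  # hypothetical default model


class ChatResponse(BaseModel):
    answer: str


@app.post("/v1/chat", response_model=ChatResponse)
async def chat(req: ChatRequest) -> ChatResponse:
    """Forward a single prompt to the LLM provider and return its reply."""
    try:
        resp = await client.chat.completions.create(
            model=req.model,
            messages=[{"role": "user", "content": req.prompt}],
        )
    except Exception as exc:  # surface provider failures as a gateway error
        raise HTTPException(status_code=502, detail=str(exc)) from exc
    return ChatResponse(answer=resp.choices[0].message.content or "")
```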
Responsibilities
You will design and implement distributed microservices for AI agent orchestration, and build and optimize task graph execution engines for LLM-powered workflows. You will also develop integrations with multiple LLM providers and create real-time APIs for AI interactions.
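As a rough illustration of what task graph execution can look like at the code level, the following asyncio sketch runs a small DAG of dependent tasks, starting each node only after its dependencies finish and running independent nodes concurrently. The node names and dependency structure are invented for the example; the posting does not describe the platform's actual engine.

```python
# Minimal task-graph (DAG) executor sketch; illustrative only, not the platform's engine.
import asyncio
from typing import Awaitable, Callable

TaskFn = Callable[[], Awaitable[str]]


async def run_graph(tasks: dict[str, TaskFn], deps: dict[str, list[str]]) -> dict[str, str]:
    """Run each node once its dependencies finish; independent nodes run concurrently."""
    handles: dict[str, asyncio.Task] = {}

    async def run_node(name: str) -> str:
        # Block until every upstream node has produced its result.
        await asyncio.gather(*(handles[d] for d in deps.get(name, [])))
        return await tasks[name]()

    pending = set(tasks)
    while pending:
        # Nodes whose dependencies are already scheduled can be scheduled now.
        ready = {n for n in pending if all(d in handles for d in deps.get(n, []))}
        if not ready:
            raise ValueError("task graph has a cycle or a missing dependency")
        for name in ready:
            handles[name] = asyncio.create_task(run_node(name))
        pending -= ready
    return {name: await handle for name, handle in handles.items()}


async def demo() -> None:
    async def step(label: str) -> str:
        await asyncio.sleep(0.1)  # stand-in for an LLM or tool call
        return f"{label} done"

    tasks = {"fetch": lambda: step("fetch"),
             "summarize": lambda: step("summarize"),
             "report": lambda: step("report")}
    deps = {"summarize": ["fetch"], "report": ["summarize"]}
    print(await run_graph(tasks, deps))


if __name__ == "__main__":
    asyncio.run(demo())
```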