Lead Software Engineer - Data Platform at MoEngage Inc
Karnataka, India - Full Time


Start Date

Immediate

Expiry Date

18 May, 26

Salary

0.0

Posted On

17 Feb, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Java, Python, Kubernetes, Kafka, Trino, Athena, SQL, Prometheus, Grafana, Data Pipelines, Distributed Systems, SDLC, Schema Registry, Airflow

Industry

Technology; Information and Internet

Description
Core Responsibilities

- Architecture & Development: Lead the design and development of scalable data pipelines and robust backend services using Java and Python.
- Strategic Scalability: Architect high-throughput infrastructure optimized for high availability and sub-second latency.
- High-Performance Querying: Design distributed ingestion layers and warehouse schemas tailored for Trino and Athena to support petabyte-scale querying.
- End-to-End Ownership: Drive the full SDLC, from gathering complex requirements to production deployment, monitoring, and incident response.
- Stakeholder Leadership: Partner with cross-functional teams to translate business needs into technical requirements and resolve systemic platform bottlenecks.
- Operational Excellence: Maintain a strong bias for action, ensuring the team delivers resilient, high-quality code in a fast-paced environment.
- Observability: Hands-on experience with Prometheus and Grafana for system monitoring.

Technical Requirements (Must-Have)

- Language Mastery: Expert-level proficiency in Java (primary) and Python.
- Distributed Systems: Strong command of data structures and algorithms, with a proven track record of optimizing large-scale distributed systems.
- Orchestration & Infrastructure: Hands-on Kubernetes expertise for managing containerized deployments and resolving complex infrastructure failures.
- Data & Messaging: Deep experience with SQL, Kafka, and the Confluent Schema Registry.
- Query Engines: Proficiency with distributed query engines such as Athena or Trino.

Preferred Qualifications (Bonus)

- Advanced Querying: Specialized experience in Trino performance tuning.
- Orchestration: Experience with Airflow for complex workflow management.
- Analytics: Familiarity with product analytics tools and data modeling.
Responsibilities
The role involves leading the design and development of scalable data pipelines and robust backend services using Java and Python, while architecting high-throughput infrastructure for high availability and low latency. Responsibilities also include driving the full Software Development Life Cycle (SDLC) and partnering with stakeholders to resolve platform bottlenecks.