Data Engineer at Weekday AI
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

31 Mar, 26

Salary

0.0

Posted On

31 Dec, 25

Experience

10 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, Lakehouse Architectures, AWS Cloud Services, Python, SQL, Kafka, Data Pipelines, ETL, Data Governance, Distributed Systems, Data Modeling, Real-Time Processing, Cloud-Native Solutions, Data Quality, Performance Tuning, Event-Driven Architectures

Industry

Technology; Information and Internet

Description
This role is for one of Weekday's clients.
Minimum Experience: 10 years
Location: Bengaluru
Job Type: Full-time

We are looking for a highly experienced Senior Data Engineer to design, build, and scale modern data platforms that power analytics, reporting, and data-driven decision-making. This role requires deep expertise in lakehouse architectures, AWS cloud services, Python, SQL, and Kafka-based streaming systems. You will work closely with data scientists, analytics teams, product stakeholders, and platform engineers to deliver reliable, scalable, and high-performance data solutions. As a senior member of the team, you will play a key role in defining data engineering standards, mentoring engineers, and driving best practices across batch and real-time data pipelines.

Key Responsibilities
- Design, develop, and maintain end-to-end data pipelines for batch and real-time processing using Python, SQL, and Kafka.
- Architect and implement lakehouse-based data platforms enabling scalable analytics and BI use cases.
- Build and manage cloud-native data solutions on AWS, leveraging services such as S3, EMR, Glue, Redshift, Lambda, and managed streaming services.
- Develop high-quality, reusable, and performance-optimized Python and SQL code for data ingestion, transformation, and validation.
- Implement real-time data streaming and event-driven architectures using Kafka, ensuring low latency and fault tolerance.
- Optimize data storage, partitioning, and query performance to support large-scale datasets and high concurrency.
- Collaborate with cross-functional teams to translate business requirements into robust data engineering solutions.
- Ensure data quality, reliability, security, and governance across all data assets.
- Lead technical design discussions, conduct code reviews, and mentor junior and mid-level data engineers.
- Monitor, troubleshoot, and improve system performance, reliability, and cost efficiency.

Required Skills & Qualifications
- 10–19 years of experience in Data Engineering or related roles.
- Strong hands-on experience with lakehouse architectures and modern data platforms.
- Extensive experience working with AWS data and analytics services.
- Advanced proficiency in Python for data engineering and pipeline development.
- Expert-level SQL skills for complex transformations and performance tuning.
- Solid experience designing and operating Kafka-based streaming pipelines.
- Strong understanding of distributed systems, data modeling, and ETL/ELT frameworks.
- Experience with data governance, security, and best practices in large-scale environments.

Nice to Have
- Experience with big data frameworks (e.g., Spark).
- Exposure to data observability and monitoring tools.
- Experience leading or architecting enterprise-scale data platforms.
Responsibilities
Design and maintain end-to-end data pipelines for batch and real-time processing. Collaborate with cross-functional teams to deliver scalable data solutions and ensure data quality and governance.