Data Engineer at Weekday AI
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

30 Mar, 26

Salary

0.0

Posted On

30 Dec, 25

Experience

10 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

AWS, Data Engineering, Lakehouse Architectures, Python, SQL, Data Modeling, ETL, Data Governance, Data Pipelines, Analytics, Data Quality, Data Warehousing, Cloud-native Architectures, Performance Tuning, Cost Optimization, Mentorship, Automation

Industry

Technology; Information and Internet

Description
This role is for one of Weekday's clients.

Min Experience: 10 years
Location: Bengaluru
Job Type: Full-time

We are seeking a highly experienced Data Engineer with 10–20 years of industry experience to design, build, and scale modern data platforms that power analytics, reporting, and data-driven decision-making across the organization. This role is ideal for a hands-on technical leader with deep expertise in AWS-based data ecosystems, lakehouse architectures, and advanced Python and SQL development. As a senior member of the data engineering team, you will architect resilient, high-performance data pipelines, enable reliable data access at scale, and guide best practices across data modeling, ingestion, processing, and governance.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines and data platforms using AWS services such as S3, Glue, EMR, Redshift, Athena, Lambda, and related ecosystem tools.
- Architect and implement lakehouse solutions, enabling unified storage for structured and unstructured data with support for analytics and machine learning workloads.
- Build robust ETL/ELT pipelines using Python and SQL, ensuring high data quality, accuracy, and performance (see the illustrative sketch below).
- Optimize data models, queries, and storage formats to support large-scale analytical workloads and low-latency access.
- Collaborate with analytics, data science, and business teams to translate data requirements into scalable technical solutions.
- Implement data governance, security, and compliance best practices, including access controls, encryption, and monitoring.
- Lead performance tuning, cost optimization, and reliability improvements across AWS data infrastructure.
- Provide technical leadership and mentorship to junior and mid-level data engineers, setting standards for code quality, testing, and documentation.
- Drive adoption of modern data engineering practices, including automation, observability, and CI/CD for data workflows.
- Participate in architectural reviews and contribute to the long-term data platform strategy and roadmap.

Required Skills & Experience
- 10–20 years of experience in data engineering, data platform development, or large-scale data systems.
- Strong hands-on expertise with AWS data services and cloud-native data architectures.
- Proven experience designing and implementing lakehouse architectures.
- Advanced proficiency in Python for data processing, orchestration, and automation.
- Expert-level SQL skills, including query optimization and complex analytical queries.
- Solid understanding of data modeling, schema design, and data warehousing concepts.
- Experience working with large datasets, distributed systems, and high-volume data pipelines.
- Strong problem-solving skills and the ability to work independently on complex technical challenges.

Nice to Have
- Experience with streaming or near-real-time data pipelines.
- Exposure to data governance, metadata management, or data quality frameworks.
- Prior experience mentoring teams or leading large-scale data initiatives.
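To give a sense of the kind of pipeline work described above, here is a minimal sketch of a Python ETL step that reads a raw CSV from S3, applies basic quality checks, and writes curated Parquet back to S3 for downstream querying (e.g. via Athena or Glue). It assumes only boto3 and pandas (with pyarrow installed); the bucket names, object keys, and column names are hypothetical placeholders, not details of the client's actual platform.

```python
# Minimal illustrative ETL step: extract raw CSV from S3, clean it with pandas,
# and load it back as Parquet for analytical querying.
# Bucket names, keys, and columns below are placeholders, not from the posting.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Extract: fetch a raw CSV object from the landing bucket (placeholder names).
raw = s3.get_object(Bucket="example-raw-bucket", Key="orders/2025/orders.csv")
df = pd.read_csv(io.BytesIO(raw["Body"].read()))

# Transform: basic data-quality checks and type normalization.
df = df.dropna(subset=["order_id"])
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df = df[df["order_date"].notna()]

# Load: write Parquet to the curated zone (columnar format suits Athena/Redshift Spectrum).
buf = io.BytesIO()
df.to_parquet(buf, index=False)  # requires pyarrow or fastparquet
s3.put_object(
    Bucket="example-curated-bucket",
    Key="orders/parquet/orders.parquet",
    Body=buf.getvalue(),
)
```

Parquet is chosen here because its columnar layout keeps large analytical scans cheap; in a production lakehouse this step would typically run inside Glue, EMR, or Lambda with partitioned output rather than as a standalone script.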
Responsibilities
Design, develop, and maintain scalable data pipelines and platforms using AWS services. Collaborate with analytics and data science teams to translate data requirements into technical solutions.