Senior Data Engineer at Weekday AI
Tiruchirappalli, Tamil Nadu, India
Full Time


Start Date

Immediate

Expiry Date

06 Jan, 2026

Salary

Rs 25,00,000 (25 LPA)

Posted On

08 Oct, 2025

Experience

5 years or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, GCP, Airflow, Google BigQuery, Distributed Systems, Data Pipeline Design, SQL, DBT, Spark, Kafka, Kinesis, Docker, Kubernetes, Data Modeling, Cloud Platforms, Automation, Agile Development

Industry

Technology; Information and Internet

Description
This role is for one of Weekday's clients.

Salary range: Rs 23,00,000 - Rs 25,00,000 (i.e., INR 23-25 LPA)
Min Experience: 5 years
Location: Chennai, Trichy
Job Type: Full-time

We are seeking a Senior Data Engineer with a strong foundation in software development, distributed systems, and large-scale data processing. The ideal candidate is passionate about clean, efficient code and thrives in building reliable, scalable data pipelines that empower data-driven decision-making.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines and distributed systems using tools like Airflow, Spark, Kafka, Kinesis, and BigQuery.
- Apply advanced programming expertise (in Python or similar languages) to build efficient, maintainable, and high-performance data workflows.
- Develop and optimize complex SQL and DBT queries for large-scale data processing and analytics.
- Work with semantic layers and data modeling tools such as LookML and Cube to enable self-service analytics.
- Leverage cloud platforms (preferably Google Cloud Platform or Snowflake) for scalable and cost-efficient data solutions.
- Use container technologies such as Docker and Kubernetes for data infrastructure deployment and orchestration.
- Collaborate with cross-functional teams to ensure data reliability, quality, and performance.
- Drive automation of data-driven processes, workflows, and decision-making systems.
- Follow agile development practices and contribute to continuous improvement in data engineering standards.
- Stay current with emerging technologies, including AI-assisted tools such as Cursor AI and GitHub Copilot, to enhance productivity and innovation.

Required Skills & Experience
- 5+ years of programming experience in Python or related languages.
- 2+ years of hands-on experience with distributed systems and data engineering frameworks (Airflow, Spark, the Kafka ecosystem, or Kinesis).
- Strong expertise in SQL, DBT, and data modeling.
- Experience with BigQuery, Snowflake, or other modern data warehouses.
- Solid understanding of cloud environments (preferably GCP).
- Familiarity with Docker, Kubernetes, and container-based deployments.
- Knowledge of ad-serving technologies and data standards is a plus.
- Excellent communication and collaboration skills with a problem-solving mindset.

Core Skills
GCP, Airflow, Google BigQuery, Python, Distributed Systems, Data Pipeline Design
Responsibilities
Design, develop, and maintain scalable data pipelines and distributed systems. Collaborate with cross-functional teams to ensure data reliability, quality, and performance.