Big Data Lead at HEXAWARE
Chennai, India -
Full Time


Start Date

Immediate

Expiry Date

22 Jun, 26

Salary

0.0

Posted On

24 Mar, 26

Experience

10 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Snowflake, DBT, Python, Data Modeling, Database Design, Data Quality, Data Governance, AWS, Glue, EMR, Lambda, Snowpipe, Data Vault 2.0, DevOps, CI/CD, ETL

Industry

IT Services and IT Consulting

Description
Role: Sr. Data Engineer (Snowflake + DBT + Python)
Location: Chennai
Primary Skills: Snowflake, DBT & Python
Secondary Skills: Experience with / good knowledge of data modeling, database design, data quality, and data governance principles; AWS skills (Glue/EMR, Lambda)

Responsibilities:
- Design and implement data pipelines using Snowpipe for data ingestion.
- Design and implement data pipelines using DBT for transformation and loading into the Snowflake Data Vault 2.0.
- Execute ETL processes using DBT to load data from the Snowflake Business Data Vault to the Snowflake Consumption Layer, ensuring smooth implementation of data integration strategies.
- Demonstrate a good understanding of, and hands-on experience with, DevOps, actively participating in troubleshooting issues within the data operations environment.
- Participate in data quality checks and monitoring, implementing appropriate metrics and alerts.
- Work with the team to automate data pipelines and deploy code using CI/CD tools.
- Monitor and optimize data platform performance and resource utilization.
- Document data pipelines and processes for maintainability and knowledge sharing.

Preferred Skills: Glue/EMR, Lambda, and Fivetran

Qualifications:
- 10+ years of overall experience in data engineering.
- 5+ years of demonstrated experience in the primary skills in a similar engineering role, showcasing proficiency in data transformation and integration projects.
- Solid understanding of, and hands-on experience in, the Azure Cloud environment, particularly DBT, Python, and Snowflake.
- Proficient in troubleshooting data pipelines built with DBT and Snowflake.
- Good experience with ETL processes and data integration.
- Ability to work collaboratively in a team-oriented environment.
- Excellent problem-solving and analytical skills.
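As context for the Data Vault 2.0 responsibilities above: Data Vault 2.0 models identify hub and link records by deterministic hash keys computed from business keys, which is what lets independently loaded pipelines (e.g. Snowpipe ingestion feeding DBT transformations) agree on surrogate keys. A minimal Python sketch of that hash-key pattern is below; the function name and the normalization rules (trim, uppercase, `||` delimiter) are illustrative choices, not a specification from this posting.

```python
import hashlib


def hash_key(*business_keys: str) -> str:
    """Data Vault 2.0-style hash key: normalize each business key
    (trim whitespace, uppercase), join the parts with a delimiter so
    multi-part keys cannot collide, then hash. MD5 is shown here
    because its 32-hex-character output is a common Data Vault choice;
    the same normalized input always yields the same surrogate key."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()


# Differently formatted copies of the same business key resolve to the
# same hash key, so loads from separate pipelines line up on the hub.
print(hash_key("cust-001 ") == hash_key("CUST-001"))  # True
```

Because the key is a pure function of the business key, this logic can live in a DBT macro on the Snowflake side just as easily as in Python; the sketch only demonstrates the idea.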
Responsibilities
The role involves designing and implementing data pipelines using Snowpipe for ingestion and DBT for transformation and loading processes into Snowflake Data Vault 2.0. Responsibilities also include executing ETL processes, participating in data quality checks, and automating pipelines using CI/CD tools.