Azure Data Engineer at EXL Talent Acquisition Team
India - Full Time


Start Date

Immediate

Expiry Date

30 Apr, 26

Salary

0.0

Posted On

30 Jan, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Azure Data Factory, Snowflake, DBT, SQL, ETL/ELT Pipelines, Data Integration Workflows, Data Quality, Data Governance, Azure Cloud Platform, SnowSQL, Data Warehousing, Cloud-Native Environment

Industry

Business Consulting and Services

Description
EXL Service, 10 Exchange Place, Suite 2200, Jersey City, NJ | T: +1.201.748.4700 | www.exlservice.com

Role: Azure Data Engineer
Location: All EXL Locations
Work Mode: Hybrid

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor's or master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.

Key Skills: Azure, Snowflake, SQL, Data Factory, DBT
Responsibilities
The role involves designing and developing ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT, and building and maintaining data integration workflows from various sources into Snowflake. Responsibilities also include writing optimized SQL queries, translating business requirements into technical solutions, monitoring and troubleshooting data pipelines for performance and reliability, and upholding data quality and governance standards.