Data Engineer - SQL / PySpark / Python / Reporting Tool and Cloud (5 yrs) at WNS Global Services
Noida, Uttar Pradesh, India
Full Time


Start Date

Immediate

Expiry Date

14 Jun, 26

Salary

0.0

Posted On

16 Mar, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, PySpark, Python, Cloud, Azure, AWS, GCP, Data Engineering, Data Architecture, Data Governance, Data Modelling, Spark, Excel, Data Integration, Automation, Stakeholder Management

Industry

Business Consulting and Services

Description
Company Description

WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.

Job Description

We are seeking a skilled and experienced Data Engineer with over 5 years of expertise in designing, developing, and implementing data-driven solutions. The ideal candidate has strong expertise in architecting and building scalable data solutions using cloud platforms (Azure/AWS/GCP), SQL, and PySpark. The role involves architecting, designing, and optimizing scalable data platforms and pipelines to support enterprise data initiatives. Financial domain knowledge (Order to Cash process) and ERP system integration experience are desirable.

Job Responsibilities

- Collaborate with business stakeholders to understand requirements and translate them into technical specifications.
- Troubleshoot and resolve issues related to data pipelines, data quality, SQL queries, and Python/PySpark scripts.
- Support data platform architecture, data modelling standards, and best practices for data governance and security.
- Demonstrate strong hands-on experience with SQL, Python, and Spark.
Experience

- Expert knowledge of schematic layer transformations.
- 5+ years of hands-on experience in data engineering, data architecture, and data governance.
- Strong hands-on expertise with cloud platforms (AWS/Azure/GCP) and on-premise implementations.
- Advanced proficiency in SQL (optimization, stored procedures) and Python for data processing.
- Financial domain knowledge (Order to Cash process) and ERP system integration experience are desirable.

Must Have

- Well versed in Python, Spark, SQL, and Excel (cloud and on-premise).
- Experience with data integrations, mapping, and automation.
- Outstanding communication and stakeholder management skills.

Qualifications

Graduate or above

Additional Information

UK shifts
Responsibilities
The role involves architecting, designing, and optimizing scalable data platforms and pipelines to support enterprise data initiatives, while collaborating with stakeholders to translate requirements into technical specifications. Responsibilities also include troubleshooting issues with data pipelines and data quality, and managing SQL queries and Python/PySpark scripts.