Data Engineer (Python) at Willis Limited, trading as Willis Towers Watson plc 
Gurgaon, Haryana, India
Full Time


Start Date

Immediate

Expiry Date

18 May, 26

Salary

0.0

Posted On

17 Feb, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, LLMs, API Integration, Data Analysis, Pandas, FastAPI, Flask, Azure, CI/CD, Unit Tests, Prompt Engineering, Data Pipelines, Cloud Deployments, Data Transformation, Problem-Solving, Agile

Industry

Financial Services

Description
About the Team: Neuron is a fast-growing digital initiative within WTW, revolutionizing risk placement and trading in the global insurance market. We are seeking a Data Engineer with a passion for hands-on coding, data analysis, and intelligent API integration. The role will support the design and delivery of data-driven products, with a particular focus on leveraging Python, LLMs, and cloud-native technologies.

About the Role:
• Integrate with internal/external LLM APIs (e.g., OpenAI, Azure OpenAI), including prompt engineering and pre-/post-processing as required.
• Build and maintain data analysis workflows using Pandas for data transformation and insight delivery.
• Develop RESTful APIs using FastAPI or Flask for data and document management.
• Design and implement clean, efficient, and modular Python codebases for backend services, data pipelines, and document processing workflows.
• Support the team in onboarding new data sources, integrating with Azure services, and ensuring smooth cloud deployments.
• Collaborate with product, data science, and engineering teams to translate business requirements into technical solutions.
• Write unit tests and contribute to CI/CD pipelines for robust, production-ready code.
• Stay up to date with advances in Python, LLM, and cloud technologies.

The Requirements:
• Exposure to LLM integration (prompt design, API integration, handling text data).
• Strong experience in Python, with a focus on data analysis (Pandas) and scripting.
• Hands-on experience building REST APIs (FastAPI or Flask).
• Experience developing data pipelines, data cleaning, and transformation.
• Working knowledge of Azure cloud services (Azure Functions, Blob Storage, App Service, etc.).
• (Nice to have) Experience integrating MongoDB with Python for data storage, modelling, or reporting.

Skills:
• Strong problem-solving and quantitative skills.
• Impactful written and verbal communication.
• Ability to work both independently and collaboratively in an agile, fast-paced team.
• Business acumen and eagerness to learn new technologies.
• Self-driven and able to manage tasks and priorities effectively.

Nice-to-Have:
• Experience with MongoDB modelling and aggregation pipelines.
• Familiarity with Datadog or other observability/monitoring tools.
• Exposure to document management workflows (upload, versioning, tagging).
• Insurance or financial services experience.
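For candidates wondering what the "data cleaning and transformation" duties above look like in practice, here is a minimal Pandas sketch. The column names, records, and the `clean_placements` helper are purely illustrative assumptions, not WTW's actual schema or pipeline.

```python
import pandas as pd

# Hypothetical raw placement records with the usual quirks:
# duplicate rows, string-typed numbers, and a missing value.
raw = pd.DataFrame({
    "policy_id": ["P-001", "P-002", "P-002", "P-003"],
    "premium": ["1000", "2500", "2500", None],
    "region": ["EMEA", "APAC", "APAC", "EMEA"],
})

def clean_placements(df: pd.DataFrame) -> pd.DataFrame:
    """Deduplicate, coerce premium to numeric, and drop unusable rows."""
    out = df.drop_duplicates(subset="policy_id")
    out = out.assign(premium=pd.to_numeric(out["premium"], errors="coerce"))
    return out.dropna(subset=["premium"]).reset_index(drop=True)

cleaned = clean_placements(raw)
summary = cleaned.groupby("region")["premium"].sum()
```

A real pipeline would read from an onboarded data source (e.g., Azure Blob Storage) rather than an in-memory frame, but the transformation pattern is the same.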

How To Apply:

In case you would like to apply to this job directly from the source, please click here.

Responsibilities
The role involves integrating with LLM APIs, engineering prompts, and building data analysis workflows using Pandas for transformation and insight delivery. Responsibilities also include developing RESTful APIs using FastAPI or Flask and designing clean, modular Python codebases for backend services and data pipelines.
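As an illustration of the pre-processing side of the LLM integration work mentioned here, the sketch below splits a long document into overlapping character windows before it would be sent to an API such as OpenAI or Azure OpenAI. The function name, window size, and overlap are hypothetical choices, not part of any stated WTW workflow.

```python
def chunk_text(text: str, max_chars: int = 200, overlap: int = 20) -> list[str]:
    """Split a document into overlapping windows that fit a prompt budget.

    Overlap preserves context across chunk boundaries so an LLM does not
    lose sentences that straddle two windows.
    """
    chunks = []
    step = max_chars - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + max_chars])
        if start + max_chars >= len(text):
            break
    return chunks
```

Each chunk would then be embedded in a prompt template and the model responses post-processed (parsed, validated, merged) downstream.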