Start Date
Immediate
Expiry Date
14 Jun, 25
Salary
0.0
Posted On
14 Mar, 25
Experience
0 year(s) or above
Remote Job
Yes
Telecommute
Yes
Sponsor Visa
No
Skills
Good communication skills
Industry
Information Technology/IT
We are looking for a self-motivated individual with an innovative mind, a passion for technology, and a good understanding of data modeling.
Our client is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. They meet customers wherever they are in the digital lifecycle and help them outperform the competition through speed and innovation.
They bring together distinct core competencies – in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering, and UX – along with deep expertise in BFSI, healthcare, and life sciences to help businesses capitalize on the opportunities digital offers. Their reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement.
The Python Data Engineer will focus on designing and optimizing data pipelines, integrating data from diverse sources, and ensuring the smooth flow of structured and unstructured data for audit and risk analytics. This role requires strong Python programming skills, expertise in data wrangling, and experience working with cloud platforms.
Key Responsibilities:
Develop scalable and efficient data pipelines to support audit processes.
Use Python for data extraction, transformation, and loading (ETL) from various data sources.
Implement data quality checks and validation processes.
Collaborate with business stakeholders and auditors to understand data requirements.
Leverage cloud platforms (AWS, Azure, GCP) to store and process data.
Document workflows, pipelines, and transformation logic for transparency.
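To give a flavor of the day-to-day work, the responsibilities above can be sketched as a minimal Python ETL step with a built-in data quality check. This is an illustrative example only, assuming pandas and an in-memory source; the column names (`txn_id`, `amount`) and validation rule are hypothetical.

```python
# Minimal ETL sketch with a data quality check (illustrative only).
import pandas as pd

def extract(records):
    """Extract: load raw records into a DataFrame."""
    return pd.DataFrame(records)

def transform(df):
    """Transform: coerce types and drop rows that fail validation."""
    df = df.copy()
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    # Data quality check: amounts must be present and non-negative.
    valid = df["amount"].notna() & (df["amount"] >= 0)
    return df[valid].reset_index(drop=True)

def load(df, store):
    """Load: append validated rows to a destination (here, a list)."""
    store.extend(df.to_dict(orient="records"))
    return len(df)

raw = [
    {"txn_id": "t1", "amount": "120.50"},
    {"txn_id": "t2", "amount": "not-a-number"},  # fails validation
    {"txn_id": "t3", "amount": "-5"},            # fails validation
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

In production, the destination would be a warehouse table (e.g., on Snowflake or Databricks) rather than a list, and the validation rules would be documented alongside the pipeline for auditability.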
Key Skills & Experience:
Strong hands-on experience in Python (Pandas, NumPy, PySpark).
Experience building ETL/ELT processes.
Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies (e.g., Snowflake, Databricks).
Understanding of data governance and regulatory compliance.
Ability to work in a fast-paced, regulated environment.
Please refer to the job description for details.