Data Engineer (PySpark & Informatica BDM) at GSSTech Group
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

05 Jul, 26

Salary

0.0

Posted On

06 Apr, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

PySpark, Informatica BDM, Data Engineering, SQL, Data Modeling, ETL/ELT, Big Data Processing, Data Pipelines, Risk & Compliance, Trade Finance, Agile, Cloud Environments, Distributed Data Processing, Problem-solving, Analytical Skills

Industry

IT Services and IT Consulting

Description
We are seeking a skilled Data Engineer with strong expertise in PySpark and Informatica BDM to support Risk & Compliance platforms as part of a large-scale Trade Transformation program. The role involves working closely with the Data Engineering team and Data Lead to assess platform impacts, deliver scalable solutions, and ensure high-quality data engineering outcomes.

Key Responsibilities:
- Collaborate with the Data Engineering team to analyze and assess impacts on Risk & Compliance platforms resulting from transformation initiatives.
- Translate business and impact assessments into actionable data engineering and development tasks.
- Design, develop, and implement robust data solutions aligned with enterprise architecture and standards.
- Develop and optimize data pipelines using PySpark and Informatica BDM.
- Ensure timely delivery of development tasks while maintaining high quality and adherence to SLAs.
- Work closely with cross-functional teams including business, risk, and compliance stakeholders.
- Leverage modern tools and technologies, including AI-assisted development tools (e.g., Claude), to enhance productivity and efficiency.
- Participate in code reviews, testing, and deployment activities.

Required Skills:
- Strong hands-on experience in PySpark and big data processing.
- Expertise in Informatica Big Data Management (BDM).
- Good understanding of data engineering concepts, ETL/ELT processes, and data pipelines.
- Experience working in banking, financial services, or Risk & Compliance domains.
- Strong SQL and data modeling skills.
- Familiarity with distributed data processing frameworks and cloud environments is a plus.
- Excellent problem-solving and analytical skills.

Preferred Qualifications:
- Experience in Trade Finance or transformation programs.
- Exposure to AI-assisted development tools.
- Ability to work in agile environments and manage multiple priorities.
Responsibilities
The role involves collaborating with the Data Engineering team to assess platform impacts and deliver scalable data solutions for Risk & Compliance. You will be responsible for developing and optimizing data pipelines using PySpark and Informatica BDM while ensuring high-quality, on-time delivery.