Start Date
Immediate
Expiry Date
28 Apr, 25
Salary
0.0
Posted On
29 Jan, 25
Experience
0 year(s) or above
Remote Job
No
Telecommute
No
Sponsor Visa
No
Skills
Hive, Powercenter, Hadoop, Kafka
Industry
Information Technology/IT
DESCRIPTION:
• 5+ years of development and design experience in Informatica Big Data Management
• Extensive knowledge of Oozie scheduling, HQL, Hive, HDFS (including usage of storage controllers), and data partitioning.
• Extensive experience working with SQL and NoSQL databases.
• Linux OS configuration and use, including shell scripting.
• Good hands-on experience with design patterns and their implementation.
• Well versed in Agile, DevOps, and CI/CD principles (GitHub, Jenkins, etc.), and actively involved in troubleshooting and resolving issues in a distributed-services ecosystem.
• Familiar with distributed-services resiliency and monitoring in a production environment.
• Experience in designing, building, testing, and implementing security systems, including identifying security design gaps in existing and proposed architectures and recommending changes or enhancements.
• Responsible for adhering to established policies, following best practices, developing an in-depth understanding of exploits and vulnerabilities, and resolving issues by taking the appropriate corrective action.
• Knowledge of security controls for designing source and data transfers, including cron jobs, ETLs, and JDBC/ODBC scripts.
• Understanding of networking basics, including DNS, proxies, ACLs, policies, and troubleshooting.
• High-level knowledge of data compliance and regulatory requirements, including but not limited to encryption, anonymization, data integrity, and policy-control features in large-scale infrastructures.
• Understanding of data sensitivity in logging, events, and in-memory data storage, such as keeping card numbers and other personally identifiable data out of logs.
• Implement wrapper solutions for new/existing components with no or minimal security controls to ensure compliance with bank standards.