Data Engineer at Capco Singapore
Pune, Maharashtra, India
Full Time


Start Date

Immediate

Expiry Date

12 Mar, 26

Salary

0.0

Posted On

12 Dec, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, PySpark, Hadoop, Cloudera, Airflow, Data Warehouse, Data Lake, Big Data, Spark, Scala, Java, Oracle, Netezza, SQL, Agile, Scrum, Nifi

Industry

Financial Services

Description
About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry. These projects will transform the financial services industry.

MAKE AN IMPACT

Innovative thinking, delivery excellence and thought leadership help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspectives gives us a competitive advantage.

Base Skill Requirements

MUST (Technical)

• Hands-on expertise in Python, PySpark, Hadoop, Cloudera platforms and Airflow
• Experience in Data Warehouse/Data Lake/Lakehouse projects in a product- or service-based organization
• Expertise in data engineering, with multiple end-to-end DW projects implemented in Big Data environments handling petabyte-scale data
• Solid experience building complex data pipelines with Spark (Scala/Python/Java) on Hadoop or object storage
• Experience working with databases such as Oracle and Netezza, with strong SQL knowledge
• Proficiency working within an Agile/Scrum framework, including creating user stories with well-defined acceptance criteria and participating in sprint planning and reviews

Optional (Technical)

• Experience building NiFi pipelines (preferred)
• Strong analytical skills for debugging production issues, identifying root causes and implementing mitigation plans
• Strong communication skills, both verbal and written
• Ability to multi-task across multiple projects and interface with external and internal resources
• Proactive, detail-oriented and able to work under pressure in an independent environment, with a high degree of initiative and self-motivation to drive results
• Willingness to quickly learn and implement new technologies and to participate in POCs to explore the best solution for a problem statement
• Experience working with diverse and geographically distributed project teams

How To Apply:

In case you would like to apply to this job directly from the source, please click here

Responsibilities
The Data Engineer will work on engaging projects with major financial institutions, focusing on transforming the financial services industry. They will be responsible for building complex data pipelines and implementing data engineering solutions.