Data Platform Engineer at Fenergo
Dublin, County Dublin, Ireland - Full Time


Start Date

Immediate

Expiry Date

24 May, 25

Salary

0.0

Posted On

18 Apr, 25

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

Fenergo exists for one reason and that is to better enable financial institutions to onboard and service their customers digitally, safely, and compliantly. One very simple reason for being. And there are 700 of us at Fenergo who wake up every day thinking about how to improve the customer onboarding experience through technology. And we are the best in the world at it. Which is why we count 32 of the top 50 financial institutions amongst our customers.
It is also why we are consistently ranked #1 in Customer Lifecycle Management and why we count some of the world's top companies as our technology partners: Salesforce, IBM, PwC, Accenture and DXC, to name but a few. French and UK private equity firms have recently acquired a majority stake in Fenergo, valuing the business at over $1bn, and are looking to scale the business globally. Headquartered in Dublin, Ireland, Fenergo has offices in North America (Boston, New York and Toronto), the UK (London), Spain (Madrid), Poland (Wroclaw), Asia Pacific (Sydney, Melbourne, Singapore, Hong Kong and Tokyo) and the UAE (Dubai).

Responsibilities

WHAT WILL YOU DO?

As a Data Platform Engineer, you will play a key role in designing, building, and maintaining our cloud-based data lake infrastructure on AWS. You will work closely with a ringfenced team of system engineers, data analysts and a delivery manager to implement best practices for data management and contribute to the overall success of our data platform. This role is ideal for an experienced engineer with a strong technical background who is passionate about working on innovative data projects.

RESPONSIBILITIES

  • Design and Implement Internal Data Lakes: develop and maintain data lake solutions using AWS services such as Amazon S3, AWS Glue and Amazon Athena
  • Develop Data Pipelines: build and manage data ingestion and transformation pipelines using tools like AWS Glue, Amazon Kinesis and Apache Spark (see the sketch after this list)
  • Support Data Processing: use AWS EMR, AWS Lambda and other services for data processing and transformation
  • Performance and Cost Optimisation: monitor and optimise data lake performance to ensure cost-effective resource utilisation
  • Compliance and Security: implement data security measures with services such as AWS IAM and AWS KMS, and ensure the data lake meets regulatory and compliance requirements
  • Continuous Learning: stay up to date with emerging cloud and data technologies and apply new insights to improve data lake operations
  • Collaborative Work: work with cross-functional teams to troubleshoot and resolve technical issues, providing support for data platform initiatives
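Purely as an illustration of the kind of pipeline work described above, here is a minimal PySpark sketch of an S3-to-S3 transformation job. The bucket paths, column names and cleanup steps are hypothetical assumptions, not part of the role description; in practice a job like this would typically run on AWS Glue or Amazon EMR and write partitioned Parquet that Amazon Athena can query.

    # Hypothetical sketch: read raw JSON events from S3, apply a basic cleanup,
    # and write partitioned Parquet that Athena (or a Glue crawler) can pick up.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("raw-to-curated").getOrCreate()

    # Hypothetical locations; in a Glue or EMR job these would come from job parameters.
    RAW_PATH = "s3://example-data-lake/raw/events/"
    CURATED_PATH = "s3://example-data-lake/curated/events/"

    raw = spark.read.json(RAW_PATH)

    curated = (
        raw
        .dropDuplicates(["event_id"])                       # assumed unique key
        .filter(F.col("event_id").isNotNull())
        .withColumn("event_date", F.to_date("event_time"))  # partition column
    )

    # Partitioning the Parquet output keeps Athena scans, and therefore cost, down.
    (curated
        .write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet(CURATED_PATH))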