Senior Data Engineer_Group Technology Office_GIP_Hybrid/Full Remote

at HypoVereinsbank UniCredit

București, Municipiul București, Romania

Start Date: Immediate
Expiry Date: 05 Aug, 2024
Salary: Not Specified
Posted On: 08 May, 2024
Experience: 15 year(s) or above
Skills: Java, Data Engineering, Relational Databases, Unit Testing, ETL Tools, PyCharm, Kafka, Performance Tuning, Exception Management, Mathematics, Cloudera, Python, Eclipse, MySQL, Computer Science
Telecommute: No
Sponsor Visa: No

Description:

UniCredit is embarking on a journey to modernize its business applications and their underlying technology by taking a platform-centric approach. Over the next couple of years, UniCredit is investing in building technology platforms at a group level. One such platform is the group integration platform, which will help the business quickly and easily integrate legacy systems with modern, cloud-native platforms, reducing time to market and costs. The platform is based on an event-driven architecture with built-in security, monitoring, and observability functions. Additionally, it will allow business users to build customized business journeys dynamically across all cloud and legacy platforms without coding, and to track the performance of those journeys in real time.
UniCredit wants to create a team of experienced, hands-on software engineers and architects who can accelerate the build of this platform. We are looking for skilled and experienced engineers with a deep understanding of how the kernels of operating systems are built.
The integration platform is the kernel of the group technology platforms that UniCredit is investing in. It is the core platform in UniCredit's technology platform ecosystem and will act as a bridge between the experience platform and the data platform.
As a Senior Data Engineer, you will be responsible for designing and developing configuration-based data pipeline services for the group integration platform. You will work with a team of data engineers, data analysts, and data architects to deliver data solutions that enable data-driven decision making and innovation. You will also mentor and coach junior data engineers and ensure the quality and reliability of the data products.

QUALIFICATIONS:

  • Bachelor’s degree or higher in computer science, engineering, mathematics, or a related field.

REQUIREMENTS:

  • At least 15 years of experience in data engineering, preferably in the banking or financial sector
  • Expertise in Azure cloud platform or Google Cloud Platform services and related technologies
  • Excellent programming skills in Python and Java – performance tuning, handling large data volumes, batch/stream processing, logging frameworks, exception management, and boilerplate creation
  • Proficiency in frameworks such as Spark Batch, Spark Streaming, and Flink
  • Experience working on Big Data platforms like Databricks, Google DataProc, Azure HDInsight, or Cloudera
  • Experience working on developer interfaces like IntelliJ, PyCharm, or Eclipse
  • Proficiency in data engineering tools like Azure Data Factory, Google Cloud Composer (Airflow), or other ETL tools
  • Experience in working with relational databases like MySQL and Postgres
  • Experience in working with MPP/OLAP/analytical databases like Azure Synapse, Google BigQuery, and Databricks Delta tables
  • Experience in working with streaming data using Kafka, Azure Event Hubs, or Google Pub/Sub (see the sketch after this list)
  • Experience with code promotion, automated unit testing, code execution monitoring, and deployment
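
As a rough illustration of the streaming-data experience asked for above, here is a minimal sketch in Python, assuming a locally reachable Kafka broker and the confluent-kafka client; the broker address, topic name, and consumer group are hypothetical and not part of the platform.

    # Minimal Kafka consumer sketch (hypothetical broker/topic/group).
    # Assumes the confluent-kafka client: pip install confluent-kafka
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",  # assumed local broker
        "group.id": "gip-demo-consumer",        # hypothetical consumer group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["payments-events"])     # hypothetical topic

    try:
        while True:
            msg = consumer.poll(timeout=1.0)    # wait up to 1s for a record
            if msg is None:
                continue
            if msg.error():
                # Exception management: log and skip bad records
                print(f"consumer error: {msg.error()}")
                continue
            print(f"key={msg.key()} value={msg.value().decode('utf-8')}")
    finally:
        consumer.close()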

Responsibilities:

  • Design and implement scalable, secure, and efficient data pipelines and platforms using cloud technologies and best practices.
  • Develop configuration-based data pipelines to integrate data from various sources, including legacy systems, external APIs, and streaming data sources (a minimal sketch follows this list).
  • Lead efforts to extract, transform, and load (ETL) data from legacy systems into cloud-based environments, ensuring data quality and integrity throughout the process.
  • Optimize data quality, performance, and reliability using data engineering techniques and tools.
  • Mentor and coach junior data engineers and foster a culture of data excellence and innovation.
  • Optimize pipeline code for speed, memory usage, and resource efficiency; identify bottlenecks and areas for improvement.
  • Build pipeline code that supports concurrency and parallel execution of pipeline tasks.
  • Apply data security and encryption principles when executing pipelines.
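
To make "configuration-based data pipelines" concrete, the following is a minimal sketch, not the platform's actual design: pipeline steps are declared as data and interpreted at run time. The step schema, table names, and the use of SQLite are illustrative assumptions only.

    # Configuration-driven pipeline sketch: steps live in config, not code.
    # The config schema below is hypothetical; in practice it would be
    # loaded from YAML/JSON rather than hard-coded.
    import sqlite3

    CONFIG = {
        "pipeline": {
            "name": "demo_load",
            "steps": [
                {"kind": "extract", "query": "SELECT id, amount FROM raw_payments"},
                {"kind": "transform", "drop_nulls": True},
                {"kind": "load", "target": "clean_payments"},
            ],
        }
    }

    def run(conn, cfg):
        rows = []
        for step in cfg["pipeline"]["steps"]:
            if step["kind"] == "extract":
                rows = conn.execute(step["query"]).fetchall()
            elif step["kind"] == "transform" and step.get("drop_nulls"):
                # Data quality: drop rows containing NULLs
                rows = [r for r in rows if all(v is not None for v in r)]
            elif step["kind"] == "load":
                conn.execute(f"CREATE TABLE {step['target']} (id INTEGER, amount REAL)")
                conn.executemany(f"INSERT INTO {step['target']} VALUES (?, ?)", rows)

    # Tiny self-contained demo against an in-memory database
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_payments (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO raw_payments VALUES (?, ?)", [(1, 9.5), (2, None)])
    run(conn, CONFIG)
    print(conn.execute("SELECT * FROM clean_payments").fetchall())  # [(1, 9.5)]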


REQUIREMENT SUMMARY

Experience: Min 15.0 – Max 20.0 year(s)
Industry: Information Technology/IT
Category: IT Software - DBA / Datawarehousing
Role: Software Engineering
Qualification: Graduate
Field of study: Computer science, engineering, mathematics, or a related field
Proficiency: Proficient
Vacancies: 1
Location: București, Romania