Senior Data Engineer, Aladdin Quantitative Engineering, Vice President

at BlackRock

Budapest, Közép-Magyarország, Hungary

Start Date: Immediate
Expiry Date: 06 Aug, 2024
Salary: Not Specified
Posted On: 09 May, 2024
Experience: N/A
Skills: ETL, Due Diligence, Azure, User Requirements, Data Models, Operational Readiness, Python, Data Curation, Data Migration, Snowflake, Maintenance, Ownership, Cloud Services, Data Infrastructure, Load, Design, Google Cloud Platform, User Experience, IT, Subject Matter Experts
Telecommute: No
Sponsor Visa: No

Description

About this role
Our Quantitative Aladdin Engineering team (part of Aladdin Engineering) is a diverse and distributed team with a keen interest and expertise in all things related to technology and financial analytics. We are responsible for the research and development of quantitative financial models and tools across many different areas – single-security pricing, prepayment models, risk, return attribution, optimization and portfolio construction, scenario analysis and simulations – covering all asset classes. We are also responsible for the technology platform that delivers those models to our internal partners and external clients, and for their integration with Aladdin.
Job Purpose / Background
Our team is looking for a self-motivated data engineer to contribute to our scalable data platform. The data platform is a compelling analytics offering backed by high-quality historical data, covering individual securities as well as index constituents and full portfolio context. This historical data includes indicative information, prices, analytics, exposures, index-related and other fields; it is fully quality-controlled and can be integrated with client custom data. The data set will be used by modelers for research and by clients for analytics. To design, implement and maintain this platform, we are looking for an individual with experience and interest in Python/Scala and in Snowflake for data warehousing and computation.
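
For illustration only, here is a minimal sketch of the kind of security-level history table such a platform might expose, created through the Snowflake Python connector. The table name, columns, and connection parameters below are hypothetical placeholders, not the actual Aladdin data model.

    # Hypothetical sketch: creating a quality-controlled security-level history table
    # in Snowflake via the Python connector. All names and credentials are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",      # placeholder account identifier
        user="my_user",
        password="my_password",
        warehouse="ANALYTICS_WH",
        database="HISTORICAL_DATA",
        schema="SECURITY_LEVEL",
    )

    create_table_sql = """
    CREATE TABLE IF NOT EXISTS SECURITY_HISTORY (
        as_of_date      DATE            NOT NULL,   -- history dimension
        security_id     VARCHAR         NOT NULL,   -- individual security identifier
        index_id        VARCHAR,                    -- index membership, when applicable
        price           NUMBER(18, 6),              -- indicative price
        oas             NUMBER(18, 6),              -- example analytic (option-adjusted spread)
        duration        NUMBER(18, 6),              -- example exposure
        source          VARCHAR,                    -- provenance, useful for QC / surveillance
        PRIMARY KEY (as_of_date, security_id)
    )
    """

    cur = conn.cursor()
    cur.execute(create_table_sql)
    cur.close()
    conn.close()
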
Key Responsibilities

The role involves the following core responsibilities, and a successful candidate should be able to demonstrate skills or experience in each of the following (not in order of priority):

  • Data modelling expertise: understanding the design, maintenance, and ownership of data infrastructure, and being able to design efficient data models tailored to Snowflake’s architecture, is key.
  • ETL (extract, transform, load) process knowledge: the developer needs to manage data migration into Snowflake. This includes extracting data from various sources, transforming it into a usable form, and loading it into the Snowflake platform (a minimal ingestion sketch follows this list).
  • Cloud computing skills: as Snowflake is a cloud-native platform, familiarity with cloud services, particularly AWS, Azure, or Google Cloud Platform, is necessary.
  • Performance tuning abilities: the developer should be adept at tuning Snowflake settings to balance performance and cost.
  • Working with subject matter experts and modelers to understand the business and its requirements, and helping determine the optimal data model and structure to deliver on those user requirements.
  • Understanding the data, and setting up and monitoring QC / surveillance.
  • Implementation and maintenance of a standard data / technology deployment workflow to ensure that all deliverables and enhancements are delivered in a disciplined and robust manner.
  • Build high-quality software that improves the user experience of downstream data developers.
  • Ensure operational readiness of the product and meet customer commitments with regard to incident SLAs.
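
As referenced in the ETL bullet above, a minimal, hypothetical sketch of one extract-transform-load step into Snowflake, using pandas and the connector's write_pandas helper, might look like the following. The source file, table name, and connection details are placeholders, not the team's actual pipeline.

    # Hypothetical ETL sketch: extract from a CSV source, apply a simple transform,
    # and load the result into a Snowflake table. Names and credentials are placeholders.
    import pandas as pd
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    # Extract: read raw prices from a source file (could equally be an API or database).
    raw = pd.read_csv("daily_prices.csv", parse_dates=["as_of_date"])

    # Transform: drop rows failing a basic quality check, then upper-case column
    # names to match the Snowflake table definition.
    clean = (
        raw.dropna(subset=["security_id", "price"])
           .query("price > 0")
           .rename(columns=str.upper)
    )

    # Load: append the curated frame into Snowflake.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        warehouse="LOAD_WH", database="HISTORICAL_DATA", schema="SECURITY_LEVEL",
    )
    success, n_chunks, n_rows, _ = write_pandas(conn, clean, "SECURITY_HISTORY")
    print(f"loaded={success} rows={n_rows}")
    conn.close()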

Skillset

  • Strong experience in Python is a must.
  • In-depth knowledge of ETL, data curation and analytical jobs using distributed computing frameworks (a brief sketch follows this list).
  • Experience working with large data sets with hands-on technology skills to design and build robust ingestion pipelines using industry standard frameworks.
  • Good knowledge of database system internals.
  • Strong aptitude for designing data models and building tools for data due diligence and data extraction pipelines.
  • Java / Scala knowledge is a plus.
  • DevOps experience is a plus.
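
As noted in the distributed-computing bullet above, a brief sketch of a curation job that also produces a simple QC / surveillance metric could look like the following, using PySpark as one industry-standard framework. Paths, column names and thresholds are hypothetical.

    # Hypothetical PySpark curation sketch: read raw security-level records,
    # keep the latest record per security and date, and emit a simple QC count.
    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("security-history-curation").getOrCreate()

    raw = spark.read.parquet("s3://example-bucket/raw/security_history/")

    # Deduplicate: keep the most recently ingested row per (as_of_date, security_id).
    latest = Window.partitionBy("as_of_date", "security_id").orderBy(F.col("ingest_ts").desc())
    curated = (
        raw.withColumn("rn", F.row_number().over(latest))
           .filter(F.col("rn") == 1)
           .drop("rn")
    )

    # QC / surveillance: count records with missing or non-positive prices per day.
    qc = (
        curated.withColumn("bad_price", F.col("price").isNull() | (F.col("price") <= 0))
               .groupBy("as_of_date")
               .agg(F.sum(F.col("bad_price").cast("int")).alias("bad_price_count"))
    )

    curated.write.mode("overwrite").parquet("s3://example-bucket/curated/security_history/")
    qc.write.mode("overwrite").parquet("s3://example-bucket/qc/security_history/")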

Our benefits
To help you stay energized, engaged and inspired, we offer a wide range of employee benefits including: retirement investment and tools designed to help you in building a sound financial future; access to education reimbursement; comprehensive resources to support your physical health and emotional well-being; family support programs; and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.
Our hybrid work model
BlackRock’s hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person – aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.
About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children’s educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress.
This mission would not be possible without our smartest investment – the one we make in our employees. It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive.


REQUIREMENT SUMMARY

Experience: Min N/A – Max 5.0 year(s)
Information Technology/IT
IT Software - DBA / Datawarehousing
Software Engineering
Graduate
Proficient
1
Budapest, Hungary