Parameta Solutions - Data Engineer

London, England, United Kingdom

Start Date: Immediate
Expiry Date: 16 Aug, 2024
Salary: Not Specified
Posted On: 17 May, 2024
Experience: N/A
Skills: Machine Learning, Snowflake, AWS (Amazon Web Services), Java, Statistics, Python, Data Mining, C++, Kubernetes, Visualisation, Mathematics, Market Data, Communication Skills, ETL, Computer Science, SQL, Linux
Telecommute: No
Sponsor Visa: No

Description:

The TP ICAP Group is a world-leading provider of market infrastructure.
Our purpose is to provide clients with access to global financial and commodities markets, improving price discovery, liquidity, and the distribution of data through responsible and innovative solutions.
Through our people and technology, we connect clients to superior liquidity and data solutions.
The Group is home to a stable of premium brands. Collectively, TP ICAP is the largest interdealer broker in the world by revenue, the number one Energy & Commodities broker in the world, the world’s leading provider of OTC data, and an award-winning all-to-all trading platform.
The Group operates from more than 60 offices in 27 countries. We are 5,300 people strong. We work as one to achieve our vision of being the world’s most trusted, innovative liquidity and data solutions specialist.

EXPERIENCE / COMPETENCES

Essential

  • Bachelor’s degree in computer science, engineering, mathematics, or a related technical discipline.
  • Experience working with Python and SQL; other languages such as Java, C#, or C++ are also useful.
  • Experience with time-series market data is desirable.
  • Able to write clean, scalable and performant code.
  • Proven written and verbal communication skills, including the ability to communicate effectively with both business and technical teams.

Desired

  • Some knowledge of Linux and the command line.
  • Understanding of ETL and event streaming (e.g. Kafka).
  • Experience with Snowflake, Kubernetes, and Airflow is desirable.
  • Experience with Amazon Web Services (AWS) would be beneficial.
  • Basic knowledge of data science topics like machine learning, data mining, statistics, and visualisation.

NOT THE PERFECT FIT?

Concerned that you may not meet the criteria precisely? At TP ICAP, we wholeheartedly believe in fostering inclusivity and cultivating a work environment where everyone can flourish, regardless of personal or professional background. If you are enthusiastic about this role but find that your experience doesn’t align perfectly with every aspect of the job description, we strongly encourage you to apply. You may be the ideal candidate for this position or another opportunity within our organisation. Our dedicated Talent Acquisition team is here to help you recognise how your unique skills and abilities can make a valuable contribution. Don’t hesitate to take the leap and explore the possibilities. Your potential is what truly matters to us.

Responsibilities:

ROLE OVERVIEW

We operate a hybrid model in which brokers provide business-critical intelligence to clients, supplemented by proprietary screens offering historical data, analytics, and execution functionality. Globally, we are the leading provider of proprietary over-the-counter pricing information and a unique source of data on financial, energy, and commodities products. Our market data is independent, unbiased, and non-position-influenced. Our clients include banks, insurance companies, pension and hedge funds, asset managers, energy producers and refiners, as well as risk and compliance managers and charities.
We are looking for a passionate and capable Data Engineer who wants to make a real impact and build software they can be proud of. We work in a collaborative, fast-paced environment where you will design, build, and deploy new systems and products.

ROLE RESPONSIBILITIES

  • Building performant batch and streaming data pipelines.
  • Building GenAI-driven data products.
  • Data warehouse development.
  • Cloud-based development.
  • Improving CI/CD processes.
  • Maintaining data applications, pipelines, and databases.
  • Participating in daily stand-ups as part of agile development teams.
  • Writing unit, integration and data quality tests.
  • Contributing to documentation and best practice guidelines.
  • Staying up to date with current technology and techniques.


REQUIREMENT SUMMARY

Experience: Min N/A – Max 5.0 year(s)
Industry: Information Technology/IT
Category: IT Software - Other
Role: Software Engineering
Qualification: Graduate in computer science, engineering, mathematics, or a related technical discipline
Proficiency: Proficient
Openings: 1
Location: London, United Kingdom