Data Engineer, Ticker - Remote first at Ticker Ltd
England, United Kingdom
Full Time


Start Date

Immediate

Expiry Date

08 Dec 2025

Salary

£55,000

Posted On

09 Sep 2025

Experience

0+ years

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

AWS, Automation, Data Science, Metrics, Production Systems, Low Latency, Telematics, SQL, Spark, ML, Python, Azure

Industry

Information Technology

Description

WHO ARE WE:

Ticker has built a next-generation motor insurer, leveraging connected car data and data science across every aspect of the business, including pricing, underwriting, customer engagement and claims.
The motor insurance landscape is set for a major shift, and the battle will be won or lost in the next five years. It will all come down to pricing sophistication and the most efficient use of machine learning and artificial intelligence across the whole value chain.
The business includes a first-class executive team with a proven track record in building successful insurtechs and running some of the largest insurers in the UK.

SKILLS YOU’LL NEED:

  • Databricks using Spark with Python – we have a mix of data sources in our domain, and you’ll be comfortable picking the right tool for the job to get our data where it needs to be. Ideally you’ll have exposure to running data workloads on AWS or Azure, but solid ETL patterns and practices will apply however you keep your data marts up to date.
  • Knowledge of Amazon Web Services (AWS) – all our production systems are built within AWS, and our Databricks platform is self-hosted in AWS, fed by a collection of AWS-hosted source systems, including DynamoDB, RDS, and event-driven integrations via EventBridge, SNS, and S3 (a minimal ingestion sketch follows this list).
  • SQL – you can hand-craft SQL queries with the best of them and understand how to optimise the underlying logical layers for improved performance when needed.
  • Motivated and self-directed – you’ll have a drive to move our data platform forward strategically, while still meeting daily requirements and tactical needs. It’s a fast-paced environment, and you’ll be responsible for taking your ideas through to production. We build it, we run it.
  • Passion for automation – you’ll build robots for the mundane tasks so you can spend your time delivering even more value. You’ll have experience working in a DevOps culture, be willing to drive it, and contribute to our CI/CD platform.
  • An inquiring mind that needs to prove the accuracy of the numbers you’re presenting. You’ll be building confidence every day that the analysis of our data can be trusted by both our stakeholders and our customers.
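
As a purely illustrative sketch of the ingestion pattern described above (the bucket, paths and table names below are hypothetical, not our actual pipeline), a minimal PySpark job landing S3-hosted JSON events into a Delta table might look like this:

    # Illustrative only: bucket, paths and table names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    # On Databricks a SparkSession already exists; getOrCreate() reuses it.
    spark = SparkSession.builder.getOrCreate()

    # Read raw quote events landed in S3 (e.g. delivered via EventBridge/SNS).
    raw = spark.read.json("s3://example-bucket/raw/quote-events/")

    # Parse the event timestamp and derive a date column for partitioning.
    events = (
        raw
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withColumn("event_date", F.to_date("event_ts"))
    )

    # Append to a partitioned Delta table that downstream marts build on.
    (events.write
        .format("delta")
        .mode("append")
        .partitionBy("event_date")
        .saveAsTable("bronze.quote_events"))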

BONUS POINTS - SKILLS THAT’LL GIVE YOU AN EDGE:

  • Streaming analytics and low-latency reporting – event-driven analytics. We’ve got terabytes of telematics and operational quote data to analyse with low latency (see the streaming sketch after this list).
  • A passion for running the world by the numbers – metrics and KPIs to really challenge our assumptions and give confidence in the observability of our systems and their intended outcomes.
  • An interest in Data Science, ML and AI – understanding how we leverage our Data Lake with our Data Scientists.
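
For the streaming angle, here is a hedged sketch (again with hypothetical paths and names) of a Spark Structured Streaming job keeping a low-latency rolling aggregate of telematics events, using Databricks Auto Loader to pick up new files incrementally:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Incrementally ingest telematics files as they land in S3 (Auto Loader).
    telematics = (
        spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("s3://example-bucket/raw/telematics/")
    )

    # Events per vehicle per one-minute window; the watermark bounds lateness.
    per_vehicle = (
        telematics
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withWatermark("event_ts", "5 minutes")
        .groupBy(F.window("event_ts", "1 minute"), "vehicle_id")
        .count()
    )

    # Continuously append finalised windows to a Delta table for reporting.
    (per_vehicle.writeStream
        .format("delta")
        .outputMode("append")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/telematics/")
        .toTable("silver.telematics_minute_counts"))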


Responsibilities

WHAT THE ROLE IS:

We’re looking for an experienced Data Engineer to help us build our data analytics platform on Databricks. You’ll work in a modern, cloud-native environment, delivering data analytics with a particular focus on building the data pipelines and managed data platforms on AWS. We use the best cloud-native, serverless technologies so we can focus on delivering features and value rather than infrastructure. If we’re doing our job right, our tech will delight our customers without being noticed. You’ll be an essential part of putting data at people’s fingertips fast. It’s a fast-paced, always-evolving environment, but you’ll be up for the challenge and happy to take the lead.

WHAT YOU’LL BE DOING WITH YOUR DAYS:

  • Designing, building and shipping the data pipelines and processing capabilities in our Data Lake as the foundation for our company’s demanding Data Science and MI requirements – we’re up and running with Databricks in our AWS environment
  • Working within our Engineering squads to ensure that accurate and timely use of data continues to be at the heart of our company
  • Supporting and empowering the use of our operational databases – you’ll help ensure the business always has answers at its fingertips.
  • Designing, modelling and developing across our data stack to ensure the quality of our ETL processes, automated reporting and self-service analytics.
  • Automating our Databricks AWS environment with DevOps principles and ensuring the safe progression of features into production.
  • Developing an in-depth understanding of the data within the organisation and ensuring it is used and interpreted accurately and consistently across the business.
  • Performance tuning and optimisation – we LOVE getting the best bang for our buck, getting the most efficient configuration of our data platforms through automation and proactive monitoring (see the maintenance sketch after this list).
  • Embracing the right tools for the job.
  • Being enthusiastic about learning new things to help us go faster and stay lean – we’ll support your ongoing learning with access to online training, books and more.
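
By way of illustration of that tuning-through-automation mindset (the table names are hypothetical), a scheduled Databricks job could keep Delta tables compact with the standard OPTIMIZE and VACUUM commands:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Illustrative table names; in practice this list would be configuration.
    TABLES = ["bronze.quote_events", "silver.telematics_minute_counts"]

    for table in TABLES:
        # Compact small files into larger ones for faster scans.
        spark.sql(f"OPTIMIZE {table}")
        # Drop data files no longer referenced (default 7-day retention).
        spark.sql(f"VACUUM {table}")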