Research & Innovation Fellow in Safety of ML
at University of York
University of York, England, United Kingdom
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa
---|---|---|---|---|---|---|---
Immediate | 29 Jul, 2024 | GBP 54,395 per annum | 04 May, 2024 | N/A | Machine Learning, Computer Science | No | No
Employment Type:
Full Time | Permanent
Description:
DEPARTMENT
You will join an exciting new research centre in the Department of Computer Science at the University of York. The Centre for Assuring Autonomy (CfAA) builds on the work of the Assuring Autonomy International Programme (AAIP), which pioneered approaches to assuring autonomous systems and their machine learning (ML) components. The CfAA also contributes to the assurance pillar of the Institute for Safe Autonomy (ISA). You will be based at the University, in ISA, and spend time at the Robotics and AI Collaboration (RAICo) research facility in West Cumbria, giving you direct access to systems, engineers and researchers.
SKILLS, EXPERIENCE & QUALIFICATION NEEDED
You must have a first degree in Computer Science or a cognate discipline, and a PhD in computer science or autonomous systems, or equivalent experience. You should have knowledge of machine learning and, ideally, some knowledge of safety assurance and safety cases. Experience of developing machine learning models is desirable. You must have experience of undertaking high-quality research and a proven ability to take responsibility for a research project.
Interview date: week commencing 3 June 2024
For informal enquiries: please contact Dr. Richard Hawkins – richard.hawkins@york.ac.uk
The University strives to be diverse and inclusive – a place where we can ALL be ourselves.
We particularly encourage applications from people who identify as Black, Asian or from a Minority Ethnic background, who are underrepresented at the University.
We also encourage applications from women for senior roles.
We offer family friendly, flexible working arrangements, with forums and inclusive facilities to support our staff. #EqualityatYork
Responsibilities:
You will be researching techniques for assuring the safety of ML used in safety-related applications in challenging environments. Building on previous research undertaken in the AAIP, such as AMLAS (https://www.york.ac.uk/assuring-autonomy/guidance/amlas/), you will develop methods that lead to the creation of ML components that can be demonstrated to be sufficiently safe to deploy. You will also explore how assurance of ML can be sustained through-life as the systems evolve and the environment changes. This post provides a unique opportunity to develop and validate these techniques on real autonomous robotic systems being used for nuclear decommissioning and cleanup tasks. The role will require you to work closely and effectively with other team members, including experienced engineers and researchers, and to explain your research clearly and precisely to a range of different audiences.
In this role you will initially be seconded, until March 2025, to work on real robotics projects at the RAICo research facility in Cumbria. Relocation expenses can be provided to support this where appropriate.
REQUIREMENT SUMMARY
Experience: N/A (min) to 5 years (max)
Industry: Information Technology/IT; Pharma / Biotech / Healthcare / Medical / R&D
Functional area: Software Engineering
Education: Graduate; computer science, autonomous systems, or equivalent experience
Proficiency: Proficient
Vacancies: 1
Location: University of York, United Kingdom