Sunday, February 2, 2025

Nottingham academics receive funding to help make AI safe to use

Academics from the University of Nottingham have received funding to research how to ensure Artificial Intelligence (AI) is safe to use, in a new project alongside the Universities of Oxford and Warwick.

The UKRI-funded Responsible AI UK (RAI UK) programme has announced its first round of Impact Acceleration projects, investing £1.8m to ensure AI is safe and responsive to the needs of society, addressing topics such as generative AI in teaching and learning.

In one of these Impact Acceleration projects, researchers from Horizon Digital Economy Research and the Mixed Reality Lab at the University of Nottingham will join colleagues from the University of Oxford and the University of Warwick to deliver ‘Responsible Innovation Advantage in Knowledge Exchange’ (RAKE).

The project will work with a variety of stakeholders, including businesses, standards bodies, funding organisations, research teams, doctoral training centres and SMEs, to explore how Responsible Innovation (RI) can be better embedded, and will deliver RI training sessions in these differing environments.

Dr Alan Chamberlain, in the School of Computer Science at the University of Nottingham, will lead on the development of an international interdisciplinary network to bring together academics, researchers and experts to better understand responsible AI in the context of the Arts and Humanities.

He said: “Starting to examine how people understand and apply responsible AI in their work is important. It will help us to approach the design of responsible systems together. Involving the public is fundamental, it’s vitally important to enable people to participate and have input into research.”

Professor Elvira Perez Vallejos and Dr Virginia Portillo at the University of Nottingham, along with Dr Carolyn Ten Holter at the University of Oxford, will bring their expertise to lead work with the Institute of Electrical and Electronics Engineers Standards Association, UKRI CDTs and AI spinouts to map current RI practice, identify gaps, and develop guidance to embed RI and drive knowledge exchange.

