How does the public feel about AI analysing UK government consultation responses?

Nesta’s Centre for Collective Intelligence Design, in collaboration with the UK government’s Incubator for Artificial Intelligence, has trialled an approach to involve the public in assessing AI tools for public services.

Governments around the world are embracing the use of AI tools to drive efficiencies and improvements in the delivery of public services, from healthcare to education. In the UK, the AI Opportunities Action Plan and the Blueprint for Modern Digital Government set out the ambition for responsible uptake and use of AI tools across the public sector. The UK public, however, is ambivalent about whether these tools will have a positive impact on society, with views that are often highly context-specific, depending on the tool, the organisation using it and what it will do.

The AI Social Readiness Advisory process offers a solution: independent social assurance, gauging UK public opinion on specific AI tools being considered for use in public services. We have designed this process and the AI Social Readiness Advisory Label to support public sector staff as they make decisions about AI tool procurement, deployment and risk management. The results should be used alongside other information, including technical evaluations and compliance processes.

Our first pilot was with the UK government’s Incubator for Artificial Intelligence, a technical team within Government Digital Service (part of the Department for Science, Innovation and Technology) that prototypes and develops AI tools for modern digital government.

They developed ‘Consult’, an AI tool which aims to make the process of analysing responses to public consultations faster and more insightful. The tool uses AI to extract patterns and themes from the responses, giving policy makers control over which themes to use before turning them into interactive dashboards. This keeps humans in control and frees them to do the work of drawing meaningful insights from those patterns.
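
For readers curious about the mechanics, the sketch below shows one way an LLM-based theme-extraction step like this could work in principle. It is illustrative only: the model name, prompt wording and function names are assumptions for demonstration, not Consult’s actual implementation.

```python
# Illustrative sketch only: Consult's real pipeline is not described in
# detail here. The model name, prompt and structure are assumptions.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def extract_themes(responses: list[str], max_themes: int = 10) -> str:
    """Ask an LLM to propose candidate themes from consultation responses.

    The output is a *draft* list for human review: in a Consult-style
    workflow, policy makers decide which themes to keep before any
    responses are grouped or visualised.
    """
    joined = "\n\n".join(f"Response {i + 1}: {r}" for i, r in enumerate(responses))
    prompt = (
        f"Identify up to {max_themes} recurring themes in the following "
        "public consultation responses. For each theme, give a short label "
        "and a one-sentence description.\n\n" + joined
    )
    result = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any capable LLM would do
        messages=[{"role": "user", "content": prompt}],
    )
    return result.choices[0].message.content


# Candidate themes are printed for a human analyst to accept, merge or
# reject before they feed into any dashboard, keeping humans in control.
if __name__ == "__main__":
    sample = [
        "The proposed cycle lanes would make my commute safer.",
        "I worry the scheme will reduce parking for local businesses.",
        "Safer cycling routes would encourage my children to bike to school.",
    ]
    print(extract_themes(sample))
```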

In May and June 2025, we ran 18 deliberative polling sessions involving a total of 144 members of the public, capturing their views on the Consult tool. The result is an ‘AI Social Readiness Advisory Label’ for the Consult tool – a summary of public views on benefits and risks, as well as recommendations for future development and deployment of the tool. A full report accompanies each Label.

Headlines: what did the public think about Consult?

Overall, people agreed that using the Consult tool would benefit the consultation process. Most people were comfortable with this use of AI because it focuses on a small part of the consultation process, has human oversight and doesn’t make decisions.

  • All of the potential benefits of the Consult tool were recognised as important. People valued that the tool could help UK government departments run consultations more efficiently, but they also wanted to know what the cost- and time-savings would be used for and what ongoing development and maintenance costs would be.
  • People have their own ideas for how AI tools can improve public consultations. Ideas suggested by the public included using AI to enhance inclusion by translating responses from different languages or by interpreting handwritten and audio responses.
  • Participants were much less concerned about the risks posed by Consult than about risks from AI tools in general. But people weren’t satisfied with the current mitigations described for two tool-specific risks, model manipulation and environmental impact, and wanted to know more about how the Consult team would manage these.
  • People had lingering concerns that this use of AI could have longer-term impacts on the democratic process. They discussed whether adopting the tool could lead to changes in the number and diversity of responses to government consultations. They also worried that using an AI model developed by a US technology company might lead to political bias, escalating costs or undue influence.
  • Most people believe that human oversight is crucial for safely using the Consult tool. They suggested that more sensitive and local issues should require a higher degree of human checking to ensure that the Consult tool didn’t miss critical information.

What next?

The Magic Notes tool, developed by social enterprise Beam, will be the next AI tool to go through the AI Social Readiness Advisory process in autumn 2025.

If you have an AI tool intended for use, or being used, in UK public services and you would like it to go through the AI Social Readiness Advisory process, please get in touch by emailing [email protected].

Authors

Aleks Berditchevskaia

Principal Researcher, Centre for Collective Intelligence Design

Aleks Berditchevskaia is the Principal Researcher at Nesta’s Centre for Collective Intelligence Design.

Kathy Peach

Director of the Centre for Collective Intelligence Design

The Centre for Collective Intelligence Design explores how human and machine intelligence can be combined to develop innovative solutions to social challenges.

Esther Moss

Product Design Lead, Centre for Collective Intelligence Design

Esther is the product design lead for the Centre for Collective Intelligence Design at Nesta, harnessing the power of collective intelligence to design digital products & services.