
Five essential insights for public sector managers on AI and public trust

AI is at the heart of government strategy to enhance public service delivery, but its successful adoption requires a clear understanding of public opinion. For public sector organisations, building and maintaining public trust is key to deploying new AI tools responsibly.

Our new polling, carried out in partnership with Opinium, surveyed 2,050 members of the public. The findings highlight five critical areas of concern and support that should inform how public sector organisations approach AI adoption.

1. The public remains more focused on risk than opportunity

A significant portion of the UK public is not yet sold on the potential of AI in the public sector. When forced to choose between two contrasting views:

  • 41% of UK adults are more inclined to believe "AI is dangerous and/or unproven technology and should not be used in the public sector."
  • By contrast, only 29% feel AI can have a transformative effect and should be widely adopted.

This concern is compounded by a lack of trust: just 23% agree that public sector organisations will use AI responsibly.

Simply advocating for the ‘AI opportunity’ is unlikely to change many minds, and may even turn people off. It is critical to demonstrate and communicate how AI has been developed responsibly, and how you will ensure its use minimises risks.

2. Top concerns centre on AI inaccuracy, over-reliance and privacy risks

The public's primary fears about AI focus on fundamental issues of quality and control.

  • 45% of the public named accuracy problems among their biggest concerns, with risks to data privacy a close second (43%).
  • However, when asked specifically about the rollout of AI in the public sector, the number one concern (45% of UK adults) is an over-reliance on AI - a worry that its use could lead to a reduction in human judgement and skill.

Public sector managers must inform the public about the steps taken to address these risks. Our deliberative research shows that people will adjust their levels of concern if they are informed about what has been done. But, be warned, they’ll also probably see through any bulls**t.

3. Support depends on context: AI in healthcare is supported, but its use in criminal justice is opposed

Public support for AI varies significantly depending on the specific public service.

The public services where AI receives the greatest support are:

  • the NHS/healthcare (38% support vs 31% oppose)
  • transport (37% support vs 25% oppose)
  • education (36% support vs 30% oppose)

A plurality of respondents oppose the use of AI in:

  • policing/criminal justice system (38% oppose vs 28% support)
  • defence (37% oppose vs 28% support)
  • social care (34% oppose vs 29% support)

This demonstrates why a context-specific approach to public engagement is required. Learning more about a specific tool can also help people come to a more nuanced view. For instance, our deliberative research found that 83% felt positively about social workers using a note-taking tool like Magic Notes.

4. The public wants to be involved in the decision-making

A majority of the public feel they must be consulted before AI is introduced in the public sector.

  • 52% of respondents are more inclined to believe a public consultation is required, compared to 20% who believe experts are better equipped to make these decisions.
  • Consultation is deemed ‘essential’ or at least ‘important’ by large majorities for the NHS/health service (80%), policing/criminal justice (78%), and the pensions and benefits system (75%).

Respondents also want the consultation to be timely: 27% believe it should come while the tool is being tested and another 28% believe it should happen after the trial but before the full rollout. This highlights the need to find the right time to consult the public, who will also ask questions about the evidence, cost, and reallocation of any savings.

5. Public support outweighs cost savings

Public sector procurement teams and finance directors might not like this, but when evaluating new AI tools, the public clearly prioritises a democratic mandate and legitimacy over financial considerations.

Nearly half of respondents (46%) believe public support should be the more important criterion for evaluating AI tools, compared to just 18% who prioritise financial considerations.

This finding underscores the need to demonstrate that genuine engagement has taken place, securing a clear ‘social licence’ or mandate before committing to a tool.

Assessing “social readiness”

Nesta's AI Social Readiness advisory label process has been designed to meet these challenges by providing independent social assurance. It uses deliberative polling to involve over 100 representative members of the public in assessing the social acceptability of specific AI tools for public services.

Find out more about two AI tools that have already been through this process:

  • Consult: A tool to help with the analysis of public consultation data.
  • Magic Notes: A tool to help social workers take notes and write reports.

If you’d like to know more, get in touch at [email protected].

Note: We worked with Opinium to poll 2,050 UK adults from their Political Omnibus panel. The sample was weighted to nationally representative criteria. Fieldwork took place 19-21 November 2025.

Authors

Kathy Peach

Director of the Centre for Collective Intelligence Design

The Centre for Collective Intelligence Design explores how human and machine intelligence can be combined to develop innovative solutions to social challenges.

Aleks Berditchevskaia

Principal Researcher, Centre for Collective Intelligence Design

Aleks Berditchevskaia is the Principal Researcher at Nesta’s Centre for Collective Intelligence Design.
