In the 1990s, access to unlimited knowledge and to vast digital networks of like-minded people seemed to hold limitless promise. But that promise has soured as technology is increasingly harnessed to sow division and peddle conspiracy theories. Parallel tribes, incentivised to adopt extreme behaviours by the very design of the digital platforms they use, are becoming increasingly distrustful of each other and of the very idea of objective ‘truths’.

According to a BuzzFeed analysis, the top 20 fake US election stories on Facebook in 2016 together generated more engagement than the top 20 election stories from mainstream outlets. The issue is similarly widespread in the UK: one survey found that 58 per cent of people had come across news on social media in the past month that they thought was not fully accurate, and nearly 43 per cent of those who shared news admitted to having passed on content that was fake.

And, worse, the task of differentiating authentic content from fabricated material is set to get more difficult as the technological means to create convincing AI-generated deepfakes become mainstream. But technology is also being used in the fightback: journalists and fact-checkers, for example, are testing the potential of AI itself to be deployed as a means of flagging unreliable information.

The knock-on effects of disinformation for democracy, social cohesion and public health (as evidenced by the surge in Covid vaccine scepticism) have been well documented. The integrity of democracy depends on voters being able to make informed choices and to trust the outcome of elections. Protecting the economy of truth – the journalistic ecosystem that holds the whole edifice together – becomes not just a policy priority but a national security one too.

Our contributors explicitly acknowledge this risk. Elisabeth Braw suggests that, just as those on the domestic front in World War II were warned against irresponsible information sharing, we need a modern ‘pre-bunking’ service to protect citizens. Ethan Zuckerman looks ahead to a future in which policymakers get one step ahead of conspiracy theorists by using AI to simulate the more paranoid corners of the internet. David Halpern sets out an ambitious agenda for proper democratic governance of social media platforms – with rights, processes and user representation properly codified.

Disinformation superhighway was published as part of Minister for the Future, a series of bold new thinking on the long-term issues policymakers can't afford to ignore, in partnership with Prospect. Illustrations by Ian Morris.

Create a psychological defence agency to “prebunk” fake news

We the users: shifting power from platforms to people

Stress-test future policies by modelling conspiracy theories before they take hold
