
Prediction and learning – why we need more predictions, including wrong ones

It’s common in futures work to emphasise that any comments on the future are not predictions. They aim to enlighten; to provoke; to ‘disturb the present’. The future is too uncertain to predict. There’s a lot of sense in that – and our main goal with FutureFest is to stimulate, to make people’s brains fire faster, not to offer a forecast. But there is a remarkable virtue in more precise prediction, however difficult it is to do well. In many fields of human life, trying to predict the results of your actions, and then learning from what actually happens, is the root of intelligence. It’s how we learn to drive a car, play a musical instrument or shoot a bow and arrow.

Some algorithms are (sometimes) better predictors than the average professional

The spread of predictive algorithms has made prediction more precise. Healthcare has been using algorithms for decades; so have the police and criminal justice systems. Often the algorithms have proved better predictors than the average professional. In a very different field, Philip Tetlock’s work on expert political judgement was even more damning of the experts’ ability to predict.

My proposal would be not to eliminate prediction but to make it much more explicit, and to make it part of how professions and experts learn and are held to account. This is beginning to happen as teachers predict children’s exam grades. Doctors can predict the risk of patients returning to hospital. Governments can offer predictions of the impacts of their policies at the same time as they get laws passed and budgets agreed. Business leaders can predict how well their new strategy will go. Journalists can predict what they think will happen in forthcoming elections.

Learning as the crucial missing element

The aim should not be to encourage a culture of blame. The world will never follow predictions precisely. But being held accountable for what you have learned about why your prediction didn’t come true contributes to much more intelligent debate and more intelligent systems. The key is to have systematic processes of reflection on what actually happens. This learning is the missing element. Journalists are the obvious example of this, but there are many others.
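A systematic process of reflection needs a way to score predictions once outcomes are known. One standard measure is the Brier score: the mean squared gap between the probability a forecaster stated and what actually happened. The sketch below is a minimal illustration of that idea, with hypothetical track records; it is not part of the original proposal, and the forecaster data is invented.

```python
def brier_score(forecasts):
    """Score a list of (probability, outcome) pairs.

    probability is the forecaster's stated chance, in [0, 1];
    outcome is 1 if the event happened, 0 if it didn't.
    Lower scores indicate better-calibrated forecasts.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Hypothetical track records: a confident but poorly calibrated
# forecaster versus a cautious, better-calibrated one.
confident = [(0.9, 0), (0.95, 0), (0.9, 1)]
cautious = [(0.6, 1), (0.4, 0), (0.55, 1)]

print(brier_score(confident))  # the bold forecaster scores worse (higher)
print(brier_score(cautious))   # the cautious forecaster scores better (lower)
```

The point of a score like this is not to punish boldness but to make track records comparable over time: a forecaster who is always 95% sure of things that rarely happen accumulates a visibly worse score than one who states honest, modest probabilities.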

Many of the best minds have got things badly wrong

The journalist Anatole Kaletsky authoritatively told the world in 2007/8 that the financial crisis would be short and mild. The Nobel Prize-winning economist Paul Krugman said that the internet would be no more important for economies than the fax machine. The futurist Peter Schwartz boldly predicted, just ahead of the financial crash, that the world was on the verge of an unprecedented period of faster growth that would last for several decades. Ray Kurzweil is notorious for having got many predictions wildly wrong alongside some accurate ones. My friend Will Hutton forecast in the mid-90s that the UK economy was going down the tube just as we were beginning the longest period of growth in our history. Figures like John Gray repeatedly predict disasters while others, like Matt Ridley, repeatedly predict triumphs, neither much bothered by inconvenient reality. And I’ve lost count of the number of writers who have predicted China’s imminent collapse over the last 15 years (but I am aware of very few who then explained why they were wrong).

Of course I’ve made my fair share of mistakes too. I predicted that there would be a revolution in Saudi Arabia in the early 1980s; that every UK household would have a fibre connection by 2000; and that the Conservatives would win a big majority in 2010. Ten years ago I thought that both Ireland and Iceland would be models for the world, rather than exemplars of hubris and disaster, and I’ve repeatedly over-estimated the pace of change. But I would like to think that when my predictions didn’t materialise I changed my mental model of how the world works and made it more accurate.

Reducing the stupidity of public life

So I would like a world where many more people in positions of prestige are encouraged to make specific predictions, and then given visible opportunities to comment on what they’ve learned when the world moves in surprising ways. No-one should be taken seriously who doesn’t do this. Various people at different times have proposed repositories of forecasts to encourage this. The idea can also be built into the normal work of professions, with requirements to make regular and explicit predictions. It would be extremely uncomfortable for the powerful and influential. But it would also be immensely useful, a great leveller, and possibly the single most important step in reducing the stupidity of public life.

Our weekend event FutureFest, 14-15 March at Vinopolis London, features radical speakers and presentations and explores what the world might be like in decades to come. Browse the line-up and book your ticket.

Part of
FutureFest

Author

Geoff Mulgan


Chief Executive Officer

Geoff Mulgan has been Chief Executive of Nesta since 2011. Nesta is the UK's innovation foundation and runs a wide range of activities in investment, practical innovation and research.
