Shadow of the smart machine: Justice, accountability and clarity in the age of algorithms

We rely on increasingly complex algorithms to make the most of ever-larger data sets. Algorithms can comb through huge volumes of data to find hidden patterns or correlations, helping us make better decisions and even make predictions about the future.

Many of these algorithms and systems have been around in various forms since the 1960s, but with larger data sets to learn from, greater processing power and advances in how the algorithms themselves operate, they are becoming more powerful and complex. Machine learning algorithms can ‘learn’ on their own and identify patterns never imagined by their human designers. The complexity of many of these systems means the machine’s logic can be practically impossible for a human expert to fully understand.

These advances, and the opportunities they create, have made these systems invaluable to many businesses. More and more businesses and governments are looking to capture the benefits of these smart machines.

Algorithms and data analysis have been a core part of the operation of government for a long time, but the scale, power and complexity of today’s systems are new. States and public services around the world are turning to these increasingly powerful, complex algorithms in the hope of making better and faster decisions and developing more efficient and effective services. They are being used to predict and tackle crime, to help prescribe medicines more accurately, and even in child services.

But while the potential benefits are considerable, the growing use of algorithms and learning machines in decisions that affect our lives raises many ethical and legal issues. As the systems for analysing data become more complex, they often become less transparent. Who is responsible for decisions that go wrong? How do we identify discrimination in an artificial intelligence? How do we ensure that important information not captured as digital data, such as our relationships or our schoolchildren’s creativity, is not ignored? How will the relationship between people and the decision-making tools they work with change as the tools become smarter and their internal logic becomes harder to understand?

For governments, which have specific responsibilities for making decisions in a transparent way, these issues present real problems. It’s heartening that the UK government is taking them seriously: it is looking at the potential risks and rewards, at what it should be doing to check for bias and discrimination, and at ways in which it can be a leader in this space. We will need to develop new tools to understand the effects of these systems. Some protections already exist for the general public, as set out in this statement by the ICO, but much is still unclear. Law and regulation still have a way to go before they catch up with this fast-changing technology.

Shadow of the Smart Machine is a series of blogs and events bringing together key representatives from government and business, along with academics in law, computer science and the social sciences, to explore the emerging ethical, accountability and regulatory concerns surrounding algorithm-supported decisions in government, with a special focus on machine learning.

Authors

Harry Armstrong

Head of Technology Futures

Harry led Nesta’s futures and emerging technology work.

Lydia Nicholas

Senior Researcher, Explorations

Lydia was a senior researcher in Nesta’s Explorations team, focusing on how minds, systems and technologies come together to perform better at complex challenges with particular focus …
