Making the case for AI in policing

At Nesta we have made the argument that some of the biggest challenges facing the UK public sector today can be tackled, or at least assisted, by making better use of data; and that doesn’t stop at policing.

Very recently, we drew on insights from the growing field of collective intelligence to explore how law enforcement could change its approach to gathering and acting upon intelligence, and we’ve continued to follow with interest some of the existing AI initiatives forces are adopting.

Having explored these, I have noticed an increasing number of publications criticising police use of data and AI. Drawing on my experience of practical experimentation with data, combined with a background in policing, I’m asking myself why we are seeing such a disparaging attitude towards forces’ attempts to innovate. Of course, given the role and responsibilities of policing in society, it is important that this is approached with caution and consideration. But routinely discrediting forces’ efforts, and dismissing out of hand the potential benefits of such practices because the technologies have been misused elsewhere, seems unhelpful. By no means am I disregarding the ethical challenges put forward, and I appreciate that the threat of reinforcing bias through the use of records of past arrests and outcomes is very real, but is that a good enough reason not to try at all?

It’s time to start thinking about policing differently and begin championing more innovative use of data in policing, for three key reasons:

1. It’s a moment in time - an opportunity to embrace new ways of working, with ethics at the core

Despite one report calling for police forces in the UK to end entirely their use of predictive mapping programs and individual risk assessment programs, AI in policing is growing and shows no sign of letting up; according to Deloitte, more than half of UK police forces plan to invest in AI by 2020. Being at the start of this journey means there is ample opportunity to develop ethical frameworks and regulation up front, requiring forces to take reasonable steps to eliminate bias from their tools.

However, although under-reported amidst the negativity, in many cases this is already happening. For example, ALGO-CARE™ is a proposed decision-making framework for the deployment of algorithmic assessment tools in policing, developed in collaboration with Durham Constabulary, which shows how ethical considerations, such as the public good and moral principles, can be factored into early deployment decision-making processes.

In another example, despite recent criticism, the National Data Analytics Solution (NDAS) project being run by West Midlands Police includes a number of steps to ensure the programme is developed as ethically as possible. At the technical level, anything ethnicity-related has been excluded and, to help reduce geographical bias, the team is also considering stripping out anything to do with geography. In addition, NDAS is consulting various independent ethics groups and external experts, and in 2018 it proactively approached the Alan Turing Institute Data Ethics Group (ATI DEG) to invite a review of the high-level strategy, to understand how a suitable ethical framework could be applied to the approach.
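To make the feature-exclusion step concrete, here is a minimal sketch of what stripping sensitive fields out before training might look like. It is illustrative only: the column names, data and model are hypothetical, and are not drawn from the NDAS programme itself.

```
# Minimal, illustrative sketch: exclude sensitive fields before a risk
# model ever sees the data. All column names and values are hypothetical;
# none of this is taken from the NDAS programme.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Tiny synthetic stand-in for an offender-history dataset.
df = pd.DataFrame({
    "prior_arrests":  [0, 3, 1, 7, 2, 5],
    "age_at_offence": [19, 34, 27, 22, 41, 30],
    "ethnicity":      ["A", "B", "A", "C", "B", "A"],        # excluded outright
    "postcode_area":  ["X1", "Y2", "X1", "Z3", "Y2", "Z3"],  # a potential proxy
    "reoffended":     [0, 1, 0, 1, 0, 1],
})

# Drop ethnicity-related fields, and geographic fields that can act as
# proxies for them, before training.
EXCLUDED = ["ethnicity", "postcode_area"]
X = df.drop(columns=EXCLUDED + ["reoffended"])
y = df["reoffended"]

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
```

It is worth noting that dropping columns does not, on its own, eliminate bias: other fields can still correlate with the excluded ones, which is one reason the kind of external ethical review NDAS has sought remains important.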

2. Data-driven policing is not about eliminating human responsibility

Machines will not replace officer knowledge and discretion, and we have seen no examples of activity based solely on the outcome of an algorithm; these tools aid decision-making rather than replace it. Of course, ethics must drive police practice and technology must never be used to punish individuals based on the outcome of an algorithm, but AI has huge potential to aid the human decision-making process.

For example, Durham Constabulary’s Harm Assessment Risk Tool (HART), despite recent criticism, uses machine learning-based statistical methods to predict the likelihood of reoffending, taking account of the vast amount of offender data held by police forces; far more information than a human custody officer could ever process. Yet the officer retains ultimate responsibility for deciding what further action should be taken on the basis of the assessment.
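In code, that advisory, human-in-the-loop pattern might look something like the sketch below. The thresholds and risk bands are hypothetical, chosen for illustration; they are not HART’s actual model.

```
# Minimal sketch of the human-in-the-loop pattern described above: the
# algorithm advises, the custody officer decides. The thresholds and
# bands here are hypothetical, not HART's actual model.

def risk_band(reoffending_probability: float) -> str:
    """Map a model's predicted probability onto an advisory risk band."""
    if reoffending_probability < 0.3:
        return "low"
    if reoffending_probability < 0.7:
        return "moderate"
    return "high"

def custody_decision(reoffending_probability: float, officer_choice: str) -> str:
    """Surface the advisory band, but return the officer's own judgement.

    The model's output informs the decision; it never makes it.
    """
    print(f"Advisory risk band: {risk_band(reoffending_probability)}")
    return officer_choice  # ultimate responsibility stays with the human

# Example: the model suggests a high risk band, yet the action taken is
# still whatever the officer chooses.
action = custody_decision(0.82, officer_choice="refer to rehabilitation programme")
```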

3. In a push for policing to keep up, forces are being actively encouraged to make greater use of their data and embrace machine learning

With rising demand and increasing pressures on the front line, police forces have to look for different ways of doing things. Forces already know this, but it is echoed time and time again by Government and others, with some even commenting that forces are bringing about the changes needed to meet current and future demand too slowly and too modestly.

In fact, after highlighting forces’ lack of long-term planning and fragmented use of technology in previous reports, Her Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) Chief Inspector Sir Thomas Winsor directly instructed forces to embrace AI in the latest State of Policing report:

“With the enormous capability and potential of AI, the police could and must do much more. The opportunity here is not only to get machines to do faster what the police already do. It is also to use technology to achieve police objectives in ways we have not even thought of yet, and might never.”

Of course, if machine learning is not yet up to scratch, it cannot alone be the answer to keeping up with demand, but the clear direction from the top is, at the very least, to experiment.

Amongst the haze, I hope that in the coming months we will begin to see more positive stories emerging of successful uses of AI in policing. I hope we will begin to see examples of AI-enabled insights informing decisions that have reduced threat, protected the vulnerable and even saved lives. We know we must approach with caution and consideration, but we can also embrace with optimism and an open mind.

Author

Michelle Eaton

Programme Manager

Michelle worked in the Government Innovation team on how the smarter use of data and technology can help civil society and public sector organisations deliver services better.