Biometrics and Privacy: Are we asking the right questions?
New biometric measurements like retina scans, voice prints and gait analysis might redefine our understanding of privacy
The vulnerability of passwords is demonstrated almost daily at the moment, with the Heartbleed bug the latest to highlight the fragility of a system built on them. Yet alternatives like biometrics are still met with scepticism and concern by the public. Are these concerns rooted in genuine threat, or has science fiction taken the lead in shaping perceptions of biometrics and their potential risks?
Passwords or Biometrics?
To get a better understanding of this, we transformed the registration process for this Nesta event into a mini experiment. At registration, attendees were asked to answer the following question:
If you had to choose between using a biometric measurement (e.g. fingerprint, retina scan) or a password to protect a private account, which one would you prefer and why?
The results showed a near-even split: of the 87 attendees who answered the question, 39 chose biometrics, 38 chose passwords, and 10 suggested a combination of both.
Out of the 39 who chose biometrics, 20 made reference to specific measurements: 8 mentioned fingerprints, 8 retina scans and 4 voice prints.
Those choosing biometrics argued that biometric measurements are more convenient, cannot be forgotten, and are perceived to be more reliable and secure than passwords. (Some made it explicit that this was their personal perception, and that they were unsure of the actual security standards.)
Most of those who chose passwords explained their choice by pointing to the limitations of biometrics: they are too invasive, they cannot be renewed or transferred, and the technology is not yet mature. The main concerns were potential bio-hacking and theft risks (e.g. chopped fingers, swapped eyeballs), along with a lack of transparency and trust in how and where biometric information is stored and subsequently used.
Science Fiction is not a guide to real life security systems
I have yet to be in a conversation about biometrics where the film Minority Report has not been referenced (and I feel the irony of being the one bringing it up in this post). These fictional biometrics are something that a lot of people relate to.
Dr Peter Waggett, Programme Leader in IBM’s Emerging Technology Group, attempted to demystify biometrics by confronting some of the film’s inaccuracies. For example, intrusive retina scans have been replaced by iris scans which are imperceptible to the person being scanned, and eye-swapping is not likely to work.
Similarly, Dr John Bustard, Lecturer at Queen's University Belfast, highlighted how his work on biometric spoofing (attempting to artificially replicate biometric information) has led to more secure and robust systems. For example, rubber fingerprints or severed fingers would not work on newer biometric systems, because these recognise not only the fingerprint but also the temperature of the finger, the pulse, and the distinct chemicals produced by the owner's skin.
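As a rough illustration of this idea, the sketch below combines a pattern match with liveness signals before accepting a scan. The field names, thresholds, and scoring are invented for the example, not taken from any real biometric system:

```python
from dataclasses import dataclass

@dataclass
class FingerScan:
    """Hypothetical readings from a fingerprint sensor."""
    pattern_match: float   # similarity to the enrolled print, 0.0-1.0
    temperature_c: float   # surface temperature of the finger
    pulse_detected: bool   # whether a pulse was sensed

def accept(scan: FingerScan) -> bool:
    """Accept only if the print matches AND the liveness checks pass.

    A rubber mould might reproduce the pattern perfectly, but it would
    still fail the temperature and pulse checks. Thresholds here are
    purely illustrative.
    """
    if scan.pattern_match < 0.95:
        return False
    if not (30.0 <= scan.temperature_c <= 38.0):  # roughly live-skin range
        return False
    return scan.pulse_detected

# A spoof with a perfect pattern but no warmth or pulse is rejected:
print(accept(FingerScan(pattern_match=0.99, temperature_c=21.0, pulse_detected=False)))  # False
# A live finger with a matching print passes:
print(accept(FingerScan(pattern_match=0.99, temperature_c=33.5, pulse_detected=True)))   # True
```

The design point is simply that each extra, independent signal raises the cost of a successful spoof, which is what makes the "chopped finger" scenario far less plausible than fiction suggests.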
Both speakers drew attention to the wide variety of biometrics available, of which fingerprints and irises are only the beginning. They emphasised the increasing adoption of behavioural biometrics such as gait or typing recognition.
Matthew Silverstone, CEO of Facebanx (face, voice and ID document recognition software), also highlighted the increasing use of these biometrics in industries such as banking and online gaming as a measure to prevent fraud.
Discussions around biometrics need to move beyond fictional examples and be more transparent about what biometric measurements are, how accurate they are, how they are used, and what genuine threats they pose to users.
Privacy is dead. We have killed it. But its shadow still looms.
“Privacy” has become a repository for a range of issues about crossing personal boundaries in the digital age. The term has been stretched to encompass everything from data protection to identity theft, and in the process the concept has become devoid of concrete meaning.
Dr Waggett argued that the relevant question is now not whether biometric information is collected about individuals, but what can be done with this information. Prof Juliet Lodge, member of the Privacy Expert Group of the Biometrics Institute, pushed for greater transparency in the data collection process. This would ultimately inform and empower users, giving them greater control over their biometric data.
We need to have a new kind of conversation
Maybe it is time to take a step back from focusing on the technology and look at what concepts like privacy, identity, human dignity and freedom mean in today’s digital world where personal information has become a widely accepted form of currency. How does our understanding of these concepts shape our attitude towards technologies like biometrics?
Organisations like the Biometrics Institute, led by Isabelle Moeller, are opening up spaces for discussion between key partners in the field, from biometrics security corporations, to researchers and policy makers. However, more practical, open discussions need to be had and more constructive questions need to be asked.
How can we create a system that embraces biometrics to the benefit of its citizens? How do we construct consistent regulation and an ethical code to restore trust in resilient technologies and the credibility of government to maintain this resilience? How can we integrate biometric information with the wider security system to respond to defensive narratives around these technologies? Can we use biometrics to actually increase confidence in privacy protection?