Emotionally sensitive robots and the art of lying

Machines that appear to be able to interact with us on an emotional level are a thriving but disruptive technology. This is not just about high-tech humanoid teaching assistants – there is a vast array of sensors, interfaces and mechanisms that are built to change our everyday lives in ways that many of us haven’t even begun to think about yet.

At the beginning of July, Nesta held a Hot Topics event that introduced these technologies; we explored the ethical and legal implications of emotionally interactive devices under the guidance of Roger Brownsword, Professor of Law. Gawain Morrison demonstrated his entertainment platform while explaining some of the social issues relevant to this field. After the initial 'wow' subsided, it became apparent that these technologies could have serious implications for society.

No such thing as paranoid androids

It is important to start by being clear about what these robots can and cannot do. Crucially, no technology can genuinely detect a human being's emotional state and respond in kind; robots cannot express emotions because they do not experience them. What these devices can do, however, is take remarkably detailed measurements of our physiology and feed them into a decision-making process that draws on models of, say, facial expression, to produce an action intended to respond to our apparent emotional state. Eerily, many of these machines can detect minute physiological changes, and infer complex emotional states, more accurately than we can ourselves - how many times have you thought to yourself, "I should stop talking now, her galvanic skin response suggests she may be becoming bored"?
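To make that distinction concrete, here is a minimal, purely illustrative sketch in Python of the kind of pipeline these devices run: measure a physiological signal, infer a probable emotional state from a simple model, and pick a response. The sensor function, thresholds and response labels are all invented for the example, and nothing in it "feels" anything.

    # Illustrative only: the sensor reading, baseline and thresholds are made up.
    def read_galvanic_skin_response():
        """Stand-in for a real sensor driver; returns skin conductance in microsiemens."""
        return 4.2

    def infer_arousal(gsr, baseline=2.0):
        # A crude model: large deviations from baseline suggest heightened arousal.
        if gsr > baseline * 1.5:
            return "high arousal"
        elif gsr < baseline * 0.5:
            return "low arousal"
        return "neutral"

    def choose_action(state):
        # The device does not experience the emotion; it simply maps a label to a behaviour.
        responses = {
            "high arousal": "soften lighting and slow the soundtrack",
            "low arousal": "raise the tempo to re-engage the viewer",
            "neutral": "carry on as normal",
        }
        return responses[state]

    state = infer_arousal(read_galvanic_skin_response())
    print(choose_action(state))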

Using our feelings for good, not evil

Gawain Morrison measures physiological arousal in concert and cinema audiences using Sensum, a platform that he designed and now uses commercially. The results are interpreted as emotional arousal and engagement with the material, a proxy for its impact on the audience. Using this technology, he has analysed viewer responses to television adverts, concluding, for example, that Coca Cola's Christmas advert is 30 seconds longer than it needs to be (confirming what many of us have presumed to be the case all along!). Gawain can measure your fear levels during a horror movie, your interest in a new product or your anxiety at a clip of a man falling off a tightrope - all via a small sensor over your finger and a battery pack on your wrist. This data can be used to alter colour palettes and soundtracks to ensure that you experience the media in the way that is most engaging for you, which is an exciting prospect.
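As a rough illustration of how a conclusion like "30 seconds too long" might be reached (the numbers and function below are invented for this post, not Sensum's actual method), one could scan an audience's averaged arousal trace for the point after which engagement never recovers:

    # Hypothetical per-second arousal scores, averaged across an audience (0 = bored, 1 = gripped).
    arousal_trace = [0.8, 0.7, 0.75, 0.6, 0.4, 0.3, 0.3, 0.25, 0.2, 0.2]

    def seconds_past_peak_engagement(trace, threshold=0.5):
        """Return how long the material runs on after engagement drops below threshold for good."""
        for i, score in enumerate(trace):
            if all(s < threshold for s in trace[i:]):
                return len(trace) - i
        return 0

    print(f"Could lose the final {seconds_past_peak_engagement(arousal_trace)} seconds")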

There are other exciting applications being developed, tested and used that I couldn't possibly cover in detail here:

  • Sophisticated satellite navigation systems that detect frustration and exclamations of "I don't want to do that", and respond with a more amenable "okay, let's do it your way".
  • Smart clothing that can tell when you are feeling down and play pre-recorded messages (saved by a loved one at an earlier date) through hood-mounted speakers for only you to hear.
  • Theme-park rides that can measure how much 'thrill' they are providing, as well as detecting nausea.
  • Mindfulness spheres that visualise your heart rate and pulse to the rhythm, helping individuals with mental health problems to focus during meditation.
  • Tactile devices that can mimic the motions of a partnered device on the other side of the world, enabling partners to partake in a physically and temporally synchronous hug.
  • Emotion-mapping software to create a picture of the office environment as a management aid.

So, you may ask, what effect does all this have on society? Beyond the gathering of biometric data from a consenting audience, I don't think we have anything to fear from Gawain. He wants a broader discussion about sharing and storing physiological data collected in this way. While that is a prime issue in the area, it has been, and will continue to be, widely discussed in relation to a gamut of other fields as well as this one. But there should be an ethical debate that runs much deeper than a concern over personal data privacy. What I think we need to start talking about is how this technology may be used in the future, and how we feel about this, preferably before it is imposed on us.

Robo-cops

As Gawain boldly announced in his opening statement - welcome to the physiological age. From virtual reality simulators to smart clothing that plays messages based on your heart rate, we are being exposed to biometric sensors with increasing frequency. Government organisations in some countries may be tracking phone-calls and e-mails, but what happens when they get hold of data about decisions we haven't even consciously made? Perhaps I don't want my facial expression to be tracked whilst shopping, giving away to the analysts in a backroom that I am incredibly anxious about shopping for underwear. Maybe the older generation don't want to be exposed to charity adverts that have been tailored to specifically tug on their heart- (and purse-) strings. But it isn't just marketers that may use this technology as a means to their (less-than-moral) ends; other fields are already showing an interest.

In the USA, a robot is being developed for use by the Border Patrol. It is believed to be the most sophisticated lie-detector in the world, and it will one day be responsible for the investigation of suspected illegal immigrants. It doesn't matter what face you pull, which relaxation techniques you use or how well you maintain 'eye contact' with the screen; a complex system of behavioural analysis will catch you out if you have something to hide. It seems ironic that governments are regularly criticised for being out of touch with the nation's feelings, yet we may soon be campaigning to keep it that way.

Truth machines

Do we have the right to lie? Humans mislead each other all the time - indeed, so do animals. Imagine a robot companion so attuned to your heart rate that when the (as-yet blissfully unaware) love of your life enters the room, it immediately starts beeping and suggesting relaxation techniques that you may wish to try. In addition to embarrassment, our metallic friends could thwart our white lies and hollow promises, causing offence and upset.

It doesn't take much thought to realise that we use our right to misrepresent, or at least suppress, emotions with startling frequency. Jealousy, disappointment and anger are hard enough to hide already, without robots calling us out on them. That said, this technology has some wonderful applications and may soon be enhancing our enjoyment of films, allowing the design of tailored theme-park experiences, and helping all of us to understand and deal with our emotions more responsibly. Personally, I don't know quite how I feel about all this; perhaps there's a robot somewhere that could help me work it out.

Author

Gemma King