
Is there an Uncanny Valley for data?

Louise Marston - 16.05.2012

As I go about scoping some new research on big data, open data and some of the opportunities and challenges for innovation, I've been wondering if there's an equivalent to the Uncanny Valley of robotics that governs how comfortable we feel with the data that we share.

As part of this programme of work, I'm interested in the barriers that stop people using, sharing and innovating with the data they already have and can access. One of the major things that holds people back is concern about privacy and data protection, along with companies' worries about overreaching with data and alienating their customers.


Analogy of the uncanny valley

In robotics, there is an idea called the 'Uncanny Valley', proposed by the Japanese roboticist Masahiro Mori. I learned about this idea from an Economist article, but you can also read this interview with Masahiro Mori in Wired.

Mori Uncanny Valley

By Smurrayinchester [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0/)], via Wikimedia Commons

The idea is that we identify, and feel comfortable with, quite crude representations of humans, like cartoons and stuffed animals. As robots become more human-like (say, from the Roomba to Asimo), we get more comfortable, until the point where it seems 'uncanny' and we treat them as weird and alien.

Bunraku puppet

A Bunraku puppet, an uncanny example from Mori's original chart [from Wikimedia].

I've been wondering if there's a similar effect in play with data and privacy. We like to receive customised discounts on the things we buy; we like online shopping that shows us only the things that will fit, or groceries we've bought before. But there is a certain point where the prediction of our wants and needs becomes too good, and we feel like we're being followed. A tweet illustrated this idea very well:

[Embedded tweet]

Spookier still is the account in the New York Times of Target, the US retailer, using shopping data to identify which of its customers were pregnant, and offering them coupons for baby equipment. If the coupons were all for baby things, there were complaints (not least from a father whose teenage daughter had received them). If they were scattered through a range of randomly selected other offers, they were accepted.

Although very little we do these days is private, most of the time we don't like to think about it, and would rather live with the illusion of anonymity alongside the benefits of personalisation. Supermarkets use vast amounts of data and analysis to arrange their stores. Many companies use game theory to analyse decision-making and secure as many sales as possible. Where does the line lie between good customer service and manipulation?

The question of where the boundary should fall between these positions will define our interactions with data over the next few years. That may mean companies being more transparent about how and why they want to use this data, and consumers getting used to the reality that personalisation and convenience are often achieved by sharing data.


Louise Marston
17 May 12, 11:09am

Do we know what we want?

Hi gopaldass,

I think that's an interesting idea. Part of the problem here is that the processes companies use to analyse data such as shopping behaviour are more aware of our actions than we are. Humans have very poor conscious awareness of how our own brains work. Asking people what they will do in a situation usually produces very different results from putting people in that situation and observing what actually happens.

On the other hand, something that Google has done very successfully (it is, in fact, the foundation of their business) is using the tiny pieces of information typed into search engine boxes to interpret intention, especially intention to buy.

So I'm sure there is much more to do here, but we shouldn't underestimate the extent to which our intentions are hidden, even from ourselves.

gopaldass
16 May 12, 12:02pm

From contrived actions to unpredictable outcomes

Hi Louise, for the past couple of years I've been working on an idea that may be related to your commentary on this post, or perhaps is at a complete tangent. I look forward to your thoughts.

I read the fascinating article in the New York Times a while back. Whilst the approach taken by Target is fairly precise and very successful, it is also intrinsically flawed, as you've suggested by drawing parallels with the 'uncanny valley' theory. This might be because supermarkets and companies equate good sales or service with customers eventually participating in a series of crude, linear actions. This is what creates fear and suspicion.

Let's turn this idea on its head: if we assume a commercial transaction is preceded by intent, can we create mechanisms that enable or encourage customers to articulate that intent? To put it quite simply, perhaps a more reflective approach to online shopping! I think this model would accommodate various, perhaps unpredictable outcomes, in which case all our actions aren't distilled into contrived transactions.

As I mentioned earlier, the idea is still a work in progress. I look forward to your thoughts,

@gopaldass