The ongoing struggle for personal data standards
Developing a standard way to express how personal data should be handled is an enduring challenge. A widely adopted standard could pave the way to true 'data portability' and a privacy-preserving internet.
There are millions of different ways to structure, format and share data. There are also millions of different ways to express how data could and should be handled. If we want to empower individuals and communities to control, understand and use their data, standards have an important role to play. In theory, they can help us to overcome such differences and work together to achieve network effects and economies of scale.
Wouldn’t it be great if, regardless of which organisation I’m dealing with, there could be a standard way for me to express my preferences about how I want my data to be used? And a standard way for organisations to represent how they intend to use my data? Similarly, what if I want to take my data out of one service and use it in another? Wouldn’t it be great if there were a standard way to take my data - messages, media, contacts, behavioural trends - with me, and share or import them into another service?
These ideas have a long history, littered with many well-intentioned efforts. In the late 1990s, in the heyday of the dotcom bubble, the World Wide Web Consortium (W3C) began work on the Platform for Privacy Preferences (P3P). This would allow every web user to configure an intelligent software agent that would operate within their web browser and automatically negotiate on their behalf with websites, which would themselves be running intelligent software agents.
Writing in 1999, Harvard law professor Lawrence Lessig suggested that these agents could take on the burdensome work of negotiating privacy, enabling a fast and efficient market for personal data which would appropriately balance privacy and utility. While it was partially adopted for several years, the P3P standard never evolved to be capable of the kind of dynamic, intelligent negotiation that Lessig and others dreamed of. The standard was officially recommended in 2002, but foundered in subsequent years before ultimately being abandoned.
While standards are great in theory, they often fail. One reason is simple short-termism: in trying to eliminate coordination costs in the long run, you end up imposing upfront costs on those who are expected to adopt the standard. P3P required website owners to audit their personal data handling practices and encode them in machine-readable form; not an easy task for many website administrators. A more challenging reason is that standards are inherently political. Some stakeholders may be better off without standardisation, or with particular versions of standards – a classic example is the war between the VHS and Betamax videotape formats in the 1980s.
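To give a flavour of the encoding burden P3P imposed, the sketch below builds a toy policy fragment with Python's standard library. POLICY, STATEMENT, PURPOSE and RETENTION are genuine elements from the P3P vocabulary, but the fragment is illustrative only, not a complete or valid P3P policy.

```python
import xml.etree.ElementTree as ET

# A toy fragment loosely modelled on P3P's vocabulary. A real policy
# had to enumerate every category of data collected, why it was
# collected, who received it, and how long it was retained.
policy = ET.Element("POLICY", {"name": "basic"})
statement = ET.SubElement(policy, "STATEMENT")

purpose = ET.SubElement(statement, "PURPOSE")
ET.SubElement(purpose, "admin")  # data used for site administration

retention = ET.SubElement(statement, "RETENTION")
ET.SubElement(retention, "stated-purpose")  # kept only as long as needed

xml_text = ET.tostring(policy, encoding="unicode")
print(xml_text)
```

Multiply this by every data flow on a site, and keep it in sync with actual practice, and it is easy to see why many administrators never produced accurate policies.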
In the case of standards for personal data control, companies whose business model depends on ‘locking in’ their customers have more to lose from standards that make it easier to get your data out. For instance, popular social networking sites and messenger applications prefer to use their own protocols to send messages between users, rather than use standard ones (like XMPP) which would allow communication across different platforms. Likewise, those with a vested interest in collecting personal data will lobby for standards that are less likely to restrict such collection.
The ‘Do Not Track’ standard, which aims to give people a very simple way to automatically broadcast their privacy preferences on the web, ended up mired in debates about what exactly constitutes ‘tracking’; unsurprisingly, representatives of the tracking industry wanted to define their activity out of existence.
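The mechanics of Do Not Track were deliberately simple: browsers sent a single `DNT: 1` HTTP header, and honouring it was the server's job. A minimal sketch of the server-side check (the function name is illustrative, not from the specification):

```python
def tracking_allowed(request_headers: dict) -> bool:
    """Return False if the client broadcast a Do Not Track preference.

    Per the DNT convention, the header value "1" means the user has
    opted out of tracking; absence of the header expresses no preference.
    """
    return request_headers.get("DNT") != "1"

# A browser with DNT enabled would send this header on every request:
print(tracking_allowed({"DNT": "1"}))  # False: the site should not track
print(tracking_allowed({}))            # True: no preference expressed
```

The technical simplicity is precisely what made the definitional fight so consequential: the entire standard hinged on what servers were obliged to do when they saw that one bit.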
Efforts to empower people and communities with their own data have much to gain from standards. Whether the goal is to compete against platform siloes, or achieve some halfway house between centralised and decentralised technologies, or simply ensure better competition between centralised providers, standards could help restore autonomy and equity to the personal data ecosystem. The challenge is to learn from past mistakes; most importantly, it means restructuring the incentives of major stakeholders so that adoption of standards makes economic sense.
The European Union’s new General Data Protection Regulation (GDPR) may help here. First, organisations could face fines of up to four per cent of annual global turnover for non-compliance with the various principles. The threat of large penalties fundamentally changes organisations’ incentives to adopt standards, especially if those standards are generally regarded by regulators as ensuring compliance.
Moreover, many of the provisions in the GDPR explicitly require or suggest technical standards. For instance, the right to data portability (Article 20) gives individuals access to data they have provided to an organisation in a structured, commonly-used and machine-readable format, and the ability to easily transfer that data to another service. Article 12 encourages organisations to provide relevant information about their privacy practices via ‘standardised icons’, which (reminiscent of Lessig circa 1999) must be machine-readable if presented electronically.
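What a 'structured, commonly-used and machine-readable format' might look like in practice is not prescribed by Article 20; JSON is one obvious candidate. The sketch below is a hypothetical export routine (the record fields and function name are invented for illustration):

```python
import json

# Hypothetical internal user record held by a service.
user_record = {
    "user_id": "u-1842",
    "messages": [
        {"to": "u-2001", "body": "hello", "sent": "2018-05-25T09:00:00Z"},
    ],
    "contacts": ["u-2001", "u-0099"],
}

def export_portable(record: dict) -> str:
    """Serialise a user's provided data in a structured,
    machine-readable format (JSON), so another service can import it."""
    return json.dumps(record, indent=2, sort_keys=True)

print(export_portable(user_record))
```

The hard part, of course, is not the serialisation but agreeing a common schema across services, which is exactly where the standards battles described above resume.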
While these legal regulations will inevitably change organisations’ incentives to adopt standards, they don’t guarantee that whichever standards are eventually adopted will empower end users or spur innovation. As data protection authorities and organisations in the public and private sector prepare for the incoming regulations, advocates of personal data empowerment have an opportunity to shape the accompanying standards to promote a more equitable data ecosystem.