Building trust and sharing value: the twin challenges of health and care data

Data has become one of the most important resources in modern healthcare. The collection, sharing and analysis of data can help identify disease earlier, and has already influenced cancer pathways and transformed the patient experience in everything from asthma to diabetes.

Data about our health and wellbeing does not just sit in the health or care system - it sits in many domains of our lives. Online search data can point to someone being in mental distress long before they ask for help. Our shopping receipts say far more about our eating, drinking or smoking habits than our health records do. Smartphone apps track everything from diabetes to our bowel movements, and many of us record our physical activity with Fitbits or Apple Watches.

With one trillion new Internet of Things (IoT) devices, such as wearables and embedded sensors, predicted to be produced by 2035, we will be able to track and analyse every aspect of our health and care. While this could unlock a future of predictive prevention and precision care, it also raises significant challenges around citizens' trust in how their data is used, as well as questions about the value of that data and who gets to exploit it.

Nesta's Health Lab has long called for a people-powered and knowledge-driven health and care system, both through our research and our practical programmes. We are pleased to be working in partnership with the Scottish Government on a new programme and fund that will explore how to build an ecosystem of trust around the use of citizens' health and care data, as well as funding innovation in how to capture and share the value of health and care data for the benefit of all.

The challenge of trust

The past couple of years have seen an erosion of the public's trust in the use of their data in the wake of high-profile scandals such as Cambridge Analytica's misuse of up to 87 million Facebook profiles to influence the US presidential election. Initiatives like care.data, which aimed to join up patient data from GPs and hospitals at a national level, showed the repercussions of getting this wrong in health. The controversial programme was criticised for a lack of clarity around the options for opting out, for confusion and concern over the safeguards protecting privacy following frequent data breaches, and for the possibility of personal data being sold to commercial companies.

In another breach of trust, the biggest NHS-approved online pharmacy, Pharmacy2U, was fined £130,000 in 2015 for selling information about more than 200,000 customers to marketing companies. In its ruling, the Information Commissioner's Office said it was likely some customers had suffered financially, as one buyer of the data had deliberately targeted elderly and vulnerable people.

There are growing pockets of good practice to learn from. Genomics England's 100,000 Genomes Project is a good example of how to build trust with citizens. To date, 100,000 whole genomes from NHS patients and their families with cancer or a rare disease have been sequenced and linked. As a direct result of the project, one in four participants with rare diseases has received a diagnosis for the first time, and up to half of the cancer patients involved have been offered an opportunity to take part in a clinical trial or to receive a targeted therapy. Participant privacy and confidentiality are vital: the data sits in a safe haven, only clinicians involved in the direct care of a patient can access identifiable information, and researchers can apply to access anonymised information. Citizen volunteers, whose DNA is part of the database, sit on a Participant Panel, helping to decide who accesses the data and shaping the information given to the public as the project progresses.
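
To make that tiered access model concrete, here is a minimal, hypothetical sketch in Python. The roles, field names and stripped identifiers are our simplification for illustration, not Genomics England's actual implementation:

    from dataclasses import dataclass, field

    @dataclass
    class Requester:
        user_id: str
        role: str                                    # "clinician" or "researcher"
        care_team: set = field(default_factory=set)  # patient IDs under this clinician's direct care
        study_approved: bool = False                  # researcher access approved, e.g. by a participant panel

    # Direct identifiers stripped from the anonymised research view (illustrative list)
    IDENTIFIERS = {"name", "nhs_number", "address", "date_of_birth"}

    def view_record(requester, patient_id, record):
        """Return the view of a patient record this requester may see, if any."""
        if requester.role == "clinician" and patient_id in requester.care_team:
            return record  # full identifiable record, for direct care only
        if requester.role == "researcher" and requester.study_approved:
            # anonymised view: remove direct identifiers before release
            return {k: v for k, v in record.items() if k not in IDENTIFIERS}
        raise PermissionError("no basis for access")

    record = {"name": "A. Patient", "nhs_number": "123", "variant": "BRCA2", "diagnosis": "breast cancer"}
    clinician = Requester("c1", "clinician", care_team={"p1"})
    researcher = Requester("r1", "researcher", study_approved=True)
    view_record(clinician, "p1", record)   # full record
    view_record(researcher, "p1", record)  # {'variant': 'BRCA2', 'diagnosis': 'breast cancer'}

In the real project the data also never leaves the safe haven; this sketch shows only the access split, not that containment.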

Artificial intelligence (AI) and automated decision-making (ADM) systems are creeping into every area of our lives, from deciding who gets a loan to who gets hired. To grow an ecosystem of trust, as well as ensuring health and care data is safe and secure, we increasingly need to think about the ethics of using AI in health and care decisions.

With high-profile media stories about algorithmic bias and, sadly, the world's first death caused by an autonomous vehicle, citizens need to be able to put their trust in these systems, or at least know how to challenge their decisions. As we argued in our report last year, Confronting Dr Robot, AI could transform healthcare, with potentially huge benefits, but there is a risk that it could make decisions that deny access to care, with “the computer saying no”. Could we envisage a future where someone is denied treatment because they refused to do their 10,000 steps a day? We are already seeing countries like China pioneer a social credit system that uses data and AI to track citizens and reward them with, or deny them, access to services.

We may like to think such a dystopian future could never happen here, but many police forces are already using predictive policing algorithms amid concern that the programmes encourage racial profiling and discrimination. And in 2017 news broke that the NHS had handed the patient records of more than 8,000 people to the Home Office as part of its drive to track down immigration offenders, showing that the UK is not averse to using data to track its citizens.

Scotland is leading the way in anticipating the ethical and human rights considerations of using biometric data through the creation of a Scottish Biometrics Commissioner, who will provide independent oversight and a code of practice to promote consistent standards in the use of biometric data. While this body will focus on biometric data in the context of policing and the criminal justice system, a similar model could be applied to sensitive biometric health data such as voice pattern analysis.

At the last count, there are over 70 AI ethics or ADM frameworks globally, but very little practical work on how AI and ADM systems could affect people's lives in health and care. The RSA recently surveyed a representative sample of 2,000 people to explore attitudes to artificial intelligence and its use in decision-making. Interestingly, 36% said that their support for these systems would increase if they were granted the right to request an explanation of the process used to reach a decision.

As more complex and personal data is held about us digitally, including genomic and biometric data, it’s becoming ever more important to build public trust around how data is used, both at an individual and population health level - now and in the future.

The value of health and care data

As well as building trust and protecting people’s privacy, we need to have a more nuanced debate about the value of sharing data if we are going to unlock its benefits, not just for an individual’s care, but for others with similar conditions, as well as the clinical and research community. The Wellcome Trust’s Understanding Patient Data programme has been instrumental in opening up the debate around the value of sharing patient data beyond individual care.

SHARE is a Scottish initiative created to establish a register of people who are interested in participating in health research and who agree to allow SHARE to check whether they might be suitable for health research studies. There are a number of clear benefits to signing up. Firstly, there is the altruistic benefit of the pooled data being used for medical research in the public interest. Secondly, participants might be invited to take part in a future research study (with no obligation) which could benefit their treatment or condition. Lastly, if any clinically useful information comes to light, this may be added to their health records (via strict NHS-approved protocols) to guide their treatment. SHARE is a good example of value sharing: it provides an easy way for people to get involved, clear benefits for research from which donors may themselves benefit, and, importantly, data held securely with clear opt-out mechanisms. SHARE recently hit the milestone of 250,000 registered patients, demonstrating the willingness of the Scottish public to share their health information for research purposes. Population-wise, that would be the equivalent of over 2.5 million patients in England voluntarily signing up.
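
As a rough check on that scaling, assuming populations of about 5.4 million for Scotland and 56 million for England (our approximations, not SHARE's figures): 250,000 × (56 ÷ 5.4) ≈ 2.6 million, which is indeed over 2.5 million.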

We also need to engage the public in conversations and debate about the use of their data that go beyond polls or surveys. Genomics England, the Scottish Genomes Partnership and UK Research and Innovation's Sciencewise programme recently engaged nearly 100 members of the public and 30 experts in events in Coventry, Edinburgh, Leeds and London. The facilitated events and subsequent report explored public aspirations, concerns and expectations about the development of genomic medicine in the UK. Almost all participants responded positively, and many developed high expectations of genomics, envisaging a near-term future with new treatments, personalisation of care and significant cost savings for the NHS. Most were relaxed about their health and genomic data being used in health research, provided this was managed carefully, but wanted clear limits on uses such as genetic engineering, predictive insurance tests and targeted marketing.

"This report highlights the crucial role that ethics and participant engagement play in establishing and maintaining public trust in genomics".

Professor Mark Caulfield, Chief Executive of Genomics England

These are welcome findings, but we need to go further and think beyond the boundaries of the health system if we are to truly maximise the potential of data to improve our health and wellbeing. A 2017 PwC report found that although a large majority of the public (92%) are happy for their health records to be shared across the health system, only 20% are comfortable with their health records being accessed by healthcare apps, and only 16% with access by healthcare technology companies.

Chart: who should have access to health records.

Media stories such as Google's DeepMind partnership with the Royal Free NHS Foundation Trust compound fears like these. The Trust provided personal data on around 1.6 million patients as part of a trial of Streams, DeepMind's mobile app for the alerting, diagnosis and detection of acute kidney injury - a life-threatening condition where minutes count. An ICO investigation found several shortcomings in how the data was handled, including that patients were not adequately informed that their data would be used as part of the test.

The lack of trust in private companies using our data is understandable, but it runs the risk of stifling much-needed innovation. The NHS does not have the resources or skills to be at the cutting edge of developing data-driven digital health technologies, and is unlikely to for the foreseeable future. It will increasingly need to partner with startups in order to innovate and provide better-targeted interventions and care.

“We never killed anyone by using their data. I’m pretty sure we have by not using their data.”

Chris Carrigan, Head, National Cancer Intelligence Network

The most valuable insights are created when different data sources are linked and new patterns and associations are found. But data about our "health" sits in many siloed domains, from our interactions with health and care services, to the over-the-counter medicines we take, to the environment in which we live. Linking these disparate data sets could allow us to intervene earlier in someone's health journey. A London child, Ella Kissi-Debrah, hit the headlines recently as potentially the first person in the UK to be killed by air pollution. It is likely that unlawful levels of pollution, detected one mile from Ella's home, contributed to her fatal asthma attack. Could the outcome have been different if Ella's parents had had access to this data in real time, or if local authorities had been using it to take more targeted preventive action?

Huge amounts of value sit in the data that the large tech giants gather about us every day. A company like Amazon probably knows you better than you know yourself from the books and food you buy and the TV shows you watch; companies like Facebook and Google also harvest, hold and sell huge amounts of personal data about us. According to Google, 1 in 22 online searches is for a health-related issue, and a staggering 51% of YouTube searches relate to health. As these companies hold increasingly valuable data sets about our health and wellbeing, a big question is who benefits from this data and how those benefits can be shared.

So what’s Nesta doing in this space?

As part of our research agenda into data-driven health and care, we want to explore the twin challenges and opportunities of trust and value.

Over the next couple of months we are pulling together the existing research on public attitudes to health and care data, to be published at the end of the summer. The paper will examine the current challenges in building trust and value when using people's health and care data, and will also explore areas of innovation, which we hope to capture in a living online repository of case studies. We are particularly interested in social care, as there is a lack of research in this space.

Building on work from colleagues across Nesta, we hope to uncover new data models that put the citizen in control of their data, ranging from personal data stores to public data trusts. We plan to share examples of new methods, such as collective intelligence design, that unlock new insights into health, and hope to discover new models of consent for data sharing and participation that move beyond a yes/no tick box - see the sketch below for what that could look like.
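
As an illustration of what "beyond a yes/no tick box" could mean in practice, here is a minimal sketch of a granular consent record in Python. The field names and purposes are hypothetical assumptions for illustration, not a proposed standard:

    from datetime import date

    # Hypothetical granular consent record: rather than a single yes/no flag,
    # each grant names a purpose, the data it covers and an optional expiry,
    # and a purpose can be explicitly refused rather than simply left out.
    consent = {
        "citizen_id": "example-123",
        "grants": [
            {"purpose": "direct_care", "data": {"gp_record"}, "expires": None},
            {"purpose": "academic_research", "data": {"activity_data"}, "expires": date(2020, 12, 31)},
            {"purpose": "marketing", "data": set(), "expires": None, "refused": True},
        ],
    }

    def may_use(consent, purpose, data_type, on):
        """True if an unexpired, unrefused grant covers this purpose and data type."""
        for g in consent["grants"]:
            if g["purpose"] == purpose and data_type in g["data"] and not g.get("refused"):
                if g["expires"] is None or on <= g["expires"]:
                    return True
        return False

    may_use(consent, "academic_research", "activity_data", date(2019, 7, 1))  # True
    may_use(consent, "marketing", "gp_record", date(2019, 7, 1))              # False

The point of the structure is that consent becomes per-purpose and time-limited, and a refusal is recorded explicitly rather than inferred from silence.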

To put this into action, we'll be launching an innovation call in partnership with the Scottish Government in late summer, looking to fund, support and test projects that take an innovative approach to these challenges of trust and value. We're also building on Nesta's work in inclusive innovation and new methods of public engagement to experiment with new ways of engaging Scottish citizens, from all walks of life, in meaningful conversations about the value of their health and care data and how to build an ecosystem of trust that works for everyone.

If you would like to get involved in our research in any way or have a demonstrator project or innovative public engagement method that you would like to discuss with us, we’d love to hear from you. Please get in contact with Sinead Mac Manus, Senior Programme Manager, Health Lab.

Author

Sinead Mac Manus

Senior Programme Manager, Digital Health

Sinead was a Senior Programme Manager for Digital Health in the Health Lab.
