Academics and policymakers alike are facing mounting pressures to improve the supply of rigorous evidence for decision-making. Yet improving the supply of good evidence is not the same as translating that evidence into meaningful changes in policy design and delivery.
When we consider evidence-informed policy-making, several actors come to mind: the producers of evidence; its consumers who, hopefully, are influenced by what the evidence says; and intermediary bodies who aim to bridge the gaps between producers and consumers. Yet these actors operate within their own systems of meaning-making, career incentives, and objective-setting. A policymaker may not be hired on the condition that they can read and understand academic jargon, while an academic's job description (with the exception of some push from requirements such as the UK's Research Excellence Framework) does not necessarily include generating evidence in direct response to policymakers' needs. Instead, what we often see is publication for the sake of publication, or the dissemination of research evidence out into the world in the hope that it will resonate with decision-makers. The problem is, it won't, at least not without a push in the right direction.
In order to improve the connection between supply and demand, resources are pumped into organisations and mechanisms that aim to extend the reach of academic evidence. Intermediary organisations all have their favourite methods of tackling the challenge, from networking workshops to secondments for scientists into Parliament, from publications of 'what works' toolkits to supporting academics in disseminating their research findings. But how effective are these interventions in actually improving evidence-informed policymaking?
In order to determine exactly what works in 'What Works', the Alliance for Useful Evidence, in partnership with the EPPI Centre at University College London, conducted an exhaustive systematic review and scoping study of the relevant research, counting 150 different techniques in 'The Science of Using Science' report. The findings demonstrate the effectiveness of these techniques (or lack thereof), and challenge prevailing ideas about how academics and policymakers can communicate with each other. One take-away stood out above the rest: disseminating research evidence is not, by itself, enough. Just as conversation is impossible when the people in it speak different languages, increasing the supply of evidence is meaningless unless it sparks a demand. That is, by itself, dissemination is dead. Here are six evidence-based mechanisms of what works in evidence use that should be done instead:
A starting point for getting actors to engage with evidence is to build awareness of evidence-informed policy-making. Yet it's not enough to merely spread the 'evidence is good' message and hope it sticks. Effective communication is about tailoring messaging in a way that is digestible for the audience, and considering how different sectors, from social workers to teachers, can find added value in using evidence. It is also about understanding what is powerful and meaningful for audiences, and celebrating those who have made evidence the norm through rewards and recognition, such as the Political Studies Association Award for Best Use of Evidence.
It makes sense that in order for policymakers and academics to talk to each other, they need to understand one another. Yet there can be disagreement on what the word 'evidence' means, let alone on how different evidence types can support different solutions. Consensus-building activities that address 'what evidence for what purpose' questions can help level the conversational playing field. Journal clubs, where people gather to discuss the latest research findings, prompt deeper discussion of the nitty-gritty of evidence production and applicability, while structured consensus techniques, such as the Delphi method, help ensure all voices are heard when a consensus on the evidence needs to be reached.
It's never going to be enough to build a website summarising research and hope it will get used. Instead, evidence disseminators need to think more like marketers, capable of targeting and tailoring content to make difficult concepts more accessible, engaging, and relevant. If you have an online evidence repository, make sure you support it with targeted and personalised messaging, audience segmentation, and a hassle-free, user-friendly design. The Education Endowment Foundation's Teaching and Learning Toolkit, for example, uses language that works for teachers and not just academics.
If you plan on engaging with policymakers directly, aim to do so during policy windows, such as before the design of a new social programme, when the likelihood of them engaging with you is higher. Frame communication in a way that is relatable to their needs, in either a positive or negative light (for example, the World Bank's report on behavioural insights demonstrates that in international development, framing evidence as a way of avoiding losses gets you further than framing it as potential gains). Tap into social media channels, like Twitter and blogs, to extend the evidence's reach. If presenting your evidence, tap into the power of narrative and use stories to explain what problem the evidence is trying to address and make it more relatable. Lastly, getting the message across takes time. It's important to repeat, remind, and refresh communications in order for a message to stick (did I mention dissemination is dead?).
A surprising finding was the lack of benefit from simply setting up interactions between policymakers and researchers, despite high levels of interest among academics in co-production. The research found that unstructured collaboration had little likelihood of success. Sticking academics and policymakers in a room together through loose forums for shared learning, such as a 'community of practice' or seminars without active educational components, won't boost the knowledge and skills of decision-makers. Instead, set clear-cut goals for what the relationship might achieve, and think about how it genuinely encourages the use of evidence.
What does work, however, is having evidence champions willing to spearhead evidence-informed policymaking in their own organisations. While they don't wear capes, they do play a key role in spreading their social influence. Through peer learning and network effects, investing in evidence leaders committed to the cause helps shift norms towards evidence use. If the cool kids are promoting evidence-informed policy-making, we should, too, right?
There is a range of bodies offering Evidence 101-type courses for decision-makers and academics alike, such as the College of Policing's 'evidence base camps' or the Alliance's own Evidence and Research Uptake Masterclasses. But the evidence suggests that you shouldn't run training in isolation, and that passive learning, such as spending a half-day chatting to civil servants about the marvels of RCTs or ethnography, won't cut it. When designing your training programme, tap into adult learning strategies, such as simulation-based learning, to make the content worthwhile. Try to situate the learning within existing organisations and tap into their motivations. This helps incentivise people to put their new capability into practice and encourages a more embedded cultural shift towards evidence use.
It's also worth thinking about how to hardwire evidence into day-to-day decision-making so that you're not relying solely on a few champions in government wearing the evidence cape. Evidence should always be on the radar, not something you think about when it's too late. This desire for embeddedness is behind the evidence transparency framework of the Institute for Government, Sense about Science, and the Alliance for Useful Evidence. While transparency of policy processes alone is not enough, it can prevent evidence from becoming an afterthought. There are also examples of how this formalisation of evidence use can take shape. The Wales Centre for Public Policy, for instance, provides evidence-on-demand services, through which Welsh Ministers can request evidence to meet their current policy challenges.
There are plenty of other ways of encouraging research use, and additional insights into how to bridge the gap between research and policy. The Institute for Government, for example, recently released a report, based on interviews with Whitehall officials, on how government can engage with academia.
These mechanisms don't work in silos, and other factors will also be in play. Politics and power can dominate how research is thought about and used, while evidence is difficult to generalise from one context to another. It's not just What Works, but for Whom, Why, When and Where. However hard it is to generalise, it's better to be more rigorous than to fall back on the desire to disseminate. It's time to be more evidence-based about evidence-based policy.