Measuring impacts, building insights, and responding to the unexpected!

Launched in February 2018, Nesta’s Future Ready Fund (FRF) supports ten innovative, early-stage interventions with the potential to promote the social and emotional skills that young people aged 11 to 18 need in order to thrive in the future. My team at the University of Sussex, based in the School of Psychology and the interdisciplinary Centre for Research and Innovation in Childhood and Youth, was selected as the evaluation partner for the fund. One of our key goals is to support the grantees in adopting a culture of rigorous, well-planned evaluation that is fully integrated with their project delivery. As we approach the end of the 18-month Future Ready Fund delivery cycle, it is an opportune time to reflect on the grantees’ evaluation journeys to date.

What are we measuring? What insights can we gain?

Right from the outset, we were keen to work with the grantees to unpack the apparently simple question often associated with evaluation – ‘Does it work?’. Especially in the case of social and emotional skills, we have found that it really helps to be precise about the outcomes of interest. Discussions in this area often start with a general idea that interventions can improve ‘resilience’, ‘teamwork’, ‘communication’, or ‘confidence’, but one of the exercises we went through – in the course of developing and refining each theory of change – was to spell out exactly what was meant. We found it helpful to look at specific changes in: cognition (how is the young person thinking?), emotion (how is the young person feeling?), behaviour (how is the young person behaving?), and motivation (what goals is the young person pursuing?). This precision has really helped the grantees make informed choices from the diverse measurement tools in the literature, such as the many tried-and-tested measures collected in the SPECTRUM database.

We were also able to think through complex questions about which informants to target (e.g., self-report, teacher report, or both?) and exactly how and when they could be approached, bearing in mind everything else happening in busy school environments. It was important for the grantees to research and evaluate the candidate measures properly, select those best suited to their projects, and feel confident in their chosen tools before using them.

Rather than relying on ad hoc self-assessment questions, we also encouraged grantees to use standardised measurement tools that have been shown in published, peer-reviewed work to produce reliable and valid data on the constructs they are trying to measure. But the value of using standardised measures always depends on the robustness of the evaluation design. In the case of the Future Ready Fund, nearly all grantees worked with us to plan both a baseline time point (before starting to implement the intervention) and a follow-up time point (after implementing the intervention), with a systematic approach to data collection. And some, but not all, of the grantees have been able to isolate the effect of their intervention by administering the measures at the same times to a control group.
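To make the logic of that design concrete, here is a minimal sketch – not the grantees’ actual analysis, and using entirely hypothetical scores – of how comparing pre/post change in an intervention group against the same change in a control group isolates the intervention’s contribution:

```python
# A minimal sketch of a pre/post design with a control group.
# All numbers are hypothetical; this is not the FRF analysis itself,
# just an illustration of why the control group matters.

# Mean scores on a standardised measure (e.g., a 1-5 scale).
intervention_group = {"baseline": 3.1, "follow_up": 3.9}
control_group = {"baseline": 3.0, "follow_up": 3.2}

# Change within each group between the two time points.
change_intervention = intervention_group["follow_up"] - intervention_group["baseline"]  # 0.8
change_control = control_group["follow_up"] - control_group["baseline"]                 # 0.2

# The control group's change reflects everything that would have happened
# anyway (maturation, school events, measurement effects). Subtracting it
# gives a difference-in-differences estimate of the intervention's effect.
estimated_effect = change_intervention - change_control
print(f"Estimated intervention effect: {estimated_effect:.1f} points")  # 0.6
```

In practice, of course, the grantees work with individual-level data and appropriate statistical tests rather than simple group means, but the underlying comparison is the same: without the control group, the 0.8-point change could not be separated from changes that would have happened anyway.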

But of course, changes in scores on standardised measures can never, in and of themselves, clarify the how and why of intervention impacts. We have spent a lot of time working through strategies for combining the use of standardised measures with qualitative analysis techniques. Moving far beyond collecting ‘nice quotes’ to use as testimonials, we have helped grantees to systematically build interview or focus group topic guides and rigorous analytic techniques to generate evidence on the mechanisms of change. We give particular attention to the contextual factors that moderate intervention impacts, identifying those that either facilitate or hinder the desired outcomes. Combining such qualitative approaches with quantitative tools, within a mixed-methods process evaluation, is particularly helpful for understanding interventions to promote students’ social and emotional skills. These interventions often sit alongside many other influential factors in busy school environments (e.g., school policies, daily experiences inside and outside the classroom, home life, and changing dynamics in peer relationships). Most of those factors are not under the grantees’ control, but the more insight that can be gained into how they interact with the interventions, the more focused and effective programme delivery can become in the future.

Responding to the unexpected

One challenge no one could have anticipated is the ongoing Covid-19 pandemic. It’s hard enough for organisations to cope with all the things that can happen in school settings to disrupt the best-laid plans, but the pandemic has brought about truly monumental transformations in the work of the FRF grantees. These changes are both internal, in terms of managing the challenges of work and home life for project staff, and external, in terms of responding to the vastly changed contexts of the project activity (not least the closure of schools). It is worth noting at this point that Nesta is providing regular updates on the measures it has put in place to support organisations in the current context.

Even though the project delivery plans – and the built-in evaluation activities – have been massively disrupted, it is rewarding to see how the grantees’ commitment to an evaluation culture can continue to generate insights regarding both the key outcomes of interest and the mechanisms that might be driving those outcomes. I think this is largely because we have worked together to move away from a simple tick-box exercise where evaluation is something that ‘has to be done’ to demonstrate impact. Rather, the data collection that we have undertaken together – including the standardised measures and qualitative work already completed, as well as new activity currently being planned as part of some grantees’ pivot to online delivery – is generating formative insights into what is going on for the young people who are at the heart of all the FRF projects.

Expecting organisations to be resilient in the context of the extraordinary pressures we are facing now is a huge ask. But one thing that really helps to build that resilience is investing time and resources in integrating evaluation into project delivery. If we have this in place, then – even though we may not know what will happen next in this unpredictable new reality – we’ll have a richer, data-informed understanding of what we’re doing, to help us respond in the best way we possibly can.

Author

Robin Banerjee

Professor Robin Banerjee is Head of the School of Psychology at the University of Sussex, where he also leads the CRESS lab.