The Centre for Social Action Innovation Fund ran from April 2013 to March 2016. It was a £14 million fund, delivered in partnership with the Cabinet Office, to support the growth of innovations that mobilise people’s energy and talents to help each other, working alongside public services.

In total, we received more than 1,400 expressions of interest. We backed a portfolio of 52 innovations, investing £11.5m in grants and a further £3m in non-financial advice and support, including rigorous evaluation of the outcomes for the people helped by these innovations.

The work focused on innovations in six priority areas where there was a plausible case for how social action could make a difference, and where we felt that the current solutions were underused.

The method

The Centre for Social Action Innovation Fund was the first time Nesta systematically integrated our Standards of Evidence into our grant-making, following their development in our Impact Investments work.

We wanted to work with projects to use evidence to increase their understanding about the impact of their work. We also believed that the innovations would need to share good quality evidence with their funders, commissioners, volunteers and beneficiaries if they were going to be successful in scaling their work.

It was important to clearly articulate the purpose from the outset – to help grantees get the evidence they needed to know what works and to scale that sustainably. The aim was not just to end up with a series of robust evaluation reports, but to create genuinely useful insights that grantees could use to improve their innovations and, where possible, to demonstrate their effectiveness.

To support the development of evidence, every innovation was helped to develop a theory of change and to gather data throughout the lifetime of the grant to strengthen its evidence of impact. Nesta and our evidence partner, The Social Innovation Partnership (TSIP), also supported each innovation to commission an independent evaluation of its work, ensuring that an appropriate approach was designed and a good quality brief developed for the evaluators.

Each grantee’s evaluation was co-designed with them, ensuring that it was tailored to their stage of development. On occasion this meant using control groups or similar, but it also meant more exploratory evaluations to help the grantees improve the design or delivery of their innovation, and/or putting systems and processes in place that better enabled grantees to monitor their impact over time.

Each innovation’s evaluation was independently verified at the end of the programme, to see whether the innovation had been able to increase the quality of its evidence of impact and how confident we could be in that evidence.

The impact

Spotlight on The Access Project

The Access Project (TAP) works with bright students from disadvantaged backgrounds, providing in-school support and personalised tuition, to help them gain access to top universities. It was awarded £100,953; £15,000 of this was for the evaluation.

At the start of the funding period, TAP was validated at Level 2 on the Nesta Standards of Evidence - that is, it had data that could begin to show effect but not causality. At that stage it was comparing the value-added scores of TAP pupils with pupils from the same school who did not take part in TAP. This provided an interesting benchmark, but there were likely to be systematic differences between the pupils who did and did not take part in TAP.

The Access Project commissioned the National Institute of Economic and Social Research (NIESR) to support its evaluation, with the aim of strengthening its own capacity to evaluate effectiveness in a more robust and rigorous way.

Working with NIESR and building on its work to date, TAP developed a number of tools and approaches for the evaluation work. To improve rigour, the team developed a matched comparison group design, using propensity score matching and national pupil data. This is a quasi-experimental approach that estimates how much of the difference in outcomes between people who receive the innovation and comparable people who do not can be attributed to the innovation itself. The data produced was assessed at Level 3 on the Nesta Standards of Evidence - that is, it demonstrated that the work caused the impact, by showing better outcomes for pupils receiving TAP support than for matched pupils who did not.

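To illustrate the logic of a matched comparison group design, the sketch below shows one common way propensity score matching can be implemented. It is a minimal illustration only: the column names, covariates and 1:1 nearest-neighbour matching are assumptions made for the example, not a description of the analysis NIESR and TAP actually ran.

```python
# Minimal propensity score matching sketch (illustrative, not TAP's actual analysis).
# Assumes a pandas DataFrame with one row per pupil: a treatment flag, background
# covariates drawn from national pupil data, and an outcome measure.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def estimate_effect(df, treatment_col, outcome_col, covariates):
    # 1. Estimate each pupil's propensity to receive the intervention
    #    from observed background characteristics.
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df[treatment_col])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df[treatment_col] == 1]
    control = df[df[treatment_col] == 0]

    # 2. Pair each treated pupil with the untreated pupil whose
    #    propensity score is closest (1:1 nearest-neighbour matching).
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_control = control.iloc[idx.ravel()]

    # 3. The estimated effect is the mean difference in outcomes between
    #    treated pupils and their matched comparison pupils.
    return treated[outcome_col].mean() - matched_control[outcome_col].mean()

# Hypothetical usage - all column names here are made up for the example:
# effect = estimate_effect(pupils, "took_part", "value_added",
#                          ["prior_attainment", "free_school_meals", "idaci_score"])
```

In practice, the credibility of an estimate like this rests on the matching variables capturing the relevant differences between the two groups, which is exactly the concern about unobserved motivation discussed below.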

Propensity score matching is widely considered a robust approach to creating a comparison group, provided that the factors on which participants are matched are sufficiently comprehensive and meaningful. The Access Project was unable to include ‘level of motivation’ as a matching factor - a variable that could affect the findings if only highly motivated students were receiving the support. It was, however, able to provide evidence making the case that this omission does not significantly weaken the findings.

For university places, the team compiled data showing how the number of pupils from each school attending top universities changed from before The Access Project started working with the school to after. This was assessed at Level 2 on the Nesta Standards of Evidence, showing impact but not causality.

While TAP had invested considerable resources in monitoring and evaluation internally before, this was the first time it had looked externally for expertise. Working with NIESR and other external partners allowed TAP to develop a set of tools it could take on and integrate into its yearly operations, both for impact measurement and for programme development and learning. For example, the learning led The Access Project to change its staff structure to support delivery in schools in new ways.

This approach increased the team’s confidence in the effectiveness of the programme: the more rigorous methods mitigate bias, and can therefore confirm with greater certainty and accuracy that the programme is having a positive impact. It was also cost-effective, with £15,000 spent on external support for the evaluation. The stronger evidence has helped TAP make the case for its work, make improvements to the programme, and continue to scale to support many more people.

The Access Project GCSE grade improvement infographic

At the beginning of the grant, The Access Project was working in London with 600 volunteers, supporting around 600 beneficiaries each academic year. By the end of the grant, it had expanded to include the West Midlands, and was working with almost 1,000 volunteers and 1,200 beneficiaries.

The Access Project has continued to grow its work beyond the lifetime of the fund, and is now working in the East Midlands, West Midlands and London, and experimenting with online tutoring as a way to reach even more young people through the Click Connect Learn programme.

Evidence remains critical to the delivery of The Access Project’s work, and the organisation continues to use and build on the evaluation methods it developed through the fund.

More information on the evaluation approach and the individual evaluations from the Centre for Social Action Innovation Fund can be found on the published evidence bank.