You can't manage what you can't measure

In the third of our series learning from the US Cities of Service, we explore what it means to have an impact and how to use data to drive decision making.

Impact volunteering is at the heart of the Cities of Service model. At its essence all volunteering should be impact volunteering – the simple focus of channelling volunteer time where it can deliver the greatest difference and being able to demonstrate that difference through data and evidence.

In reality though, it’s difficult to focus on impact and outcomes, and all too often we can dwell on outputs – numbers of volunteers, volunteer hours, number of training sessions held – rather than the difference those hours have made and how to make them more effective.

In God we trust, all others must bring data

The Chief Service Officers leading Cities of Service in the US have kept a dogged focus on the metrics, as Michael Drake noted when he spoke to our UK Cities of Service in October. Data must inform decisions – if the data shows that something isn’t working, don’t be afraid to stop doing it. By building data collection into the pilot, Michael was able to show that his Love Your School initiative could make a significant dent in childhood obesity in Little Rock’s schools, with 50% of students reducing their body fat while average literacy and maths scores rose at the same time. Proving this in the pilot phase was crucial to securing a further $100,000 of grant funding to extend the programme to 3,000 pupils across the city.

Good data takes investment

Sometimes it’s about making specific investments to get the right data. For Little Rock, one of the best investments was a body composition analysis machine to help them measure the health of the young people on their healthy school programme. For $10,000, they’ve used it again and again to prove their impact and justify further investment. And it only came about because their original partner for data collection couldn’t deliver in the first year, so they had to find an alternative. Things not going to plan forced them to make the investment and to start a conversation with the local health service, which now provides volunteer nurses to help conduct the analysis.

Change won’t happen overnight

When designing the data collection method, it’s important to give things a chance to take effect. It took a full year to see an impact on childhood obesity in Little Rock, but with confidence from partners in the data collection method, they were able to wait.

Plan your data collection

Make sure your metrics have a meaningful connection to your initiative: don’t just focus on what’s easy to collect, but on what the real indicators of change are. The first step in this process is to develop a Theory of Change to really test assumptions about what the initiative is trying to achieve and how. The second step is to create impact metrics for each initiative – what is the data point that will show whether you’ve achieved that outcome? Three or four meaningful metrics will be much more valuable than lots of tangential ones.

The art of a good question

Make sure the questions you ask don’t invalidate the data before you begin – in other words, that they are neutral. When asking about satisfaction, don’t bias responses with leading questions such as “do you agree that…?” And make sure that the method of data collection is neutral too – who is asking the questions? What’s their connection to the programme? All of this can affect the responses you get, and if the data is biased, it’s meaningless.

Have a mix of qualitative and quantitative data, and take the time to develop the right qualitative questions. In Little Rock, Michael worked with health professionals at the University of Arkansas to develop the questions and surveys. Plymouth has already taken this to heart, working with dieticians at the local university to develop metrics and data collection methods for their Grow, Share, Cook initiative.

Own your data

If you want to be able to report on it, then you need to own it. When working in schools on attainment programmes, it’s essential to get agreements up front on what data you will get access to – be it test scores, attendance records or behaviour reports. If the data isn’t within your grasp, then you’ll fall at the final hurdle of being able to evidence your hard work – a lesson that some of the US cities learnt the hard way, when schools data or offender rates sat outside their jurisdiction and beyond their reach.

Avoid mission creep

If you really want to have impact, then staying focused will be imperative. As people get excited about your programme, they’ll see it as a way to solve all manner of issues – but the more you dilute your programme, the less impact you’ll have. The Theory of Change proves a useful tool to test whether the programme is being true to its ultimate goal and that each activity really adds value to achieving that goal. Saying no is an art form, but one that can be essential to achieving impact.

Be prepared to let go  

Using the above will help set you up for success, but things can change. Although you’ve set agreements to start with, use the regular check points to make sure that the programme is delivering your objectives, not just delivering for the sake of it. It’s better to make adjustments mid-stream, or stop altogether, than to keep going with something that isn’t working and won’t deliver the impact you need. Let the data inform your decisions so you can better manage your impact, and your effort.


Photo credit: By West Midlands Police from West Midlands, United Kingdom (“101 tape measure”, uploaded by palnatoke), via Wikimedia Commons

Author

Meera Chadha

Programme Manager

Meera was a Programme Manager in the Innovation Lab working in the Centre for Social Action to support social innovations to scale their reach and grow their impact. She led Cities of …
