The DIY toolkit has been 12 months in the making: what you see here is our first cut of best-practice tools for social innovation. We’ve worked really hard to get here and hope you’ll agree that it has been worth it.
The project was initiated by Nesta and the Rockefeller Foundation back in the autumn of 2012 to create a toolkit that can be a ready reference for the international development sector. Both organisations share the view that better innovation skills will help social sector organisations to solve problems more effectively and generate innovations that improve the lives of the poor and vulnerable. The toolkit has been made in close collaboration with STBY and Quicksand, in four phases.
A starting point for the initiative was a recognition that there is an abundance of social innovation tools and methods, and an eagerness to find the most scalable solutions. Yet far too many people still spend time re-inventing the wheel for lack of accessible, proven tools that can help them bring their ideas to life. In a sea of options, it is difficult for practitioners to identify the best tools and to know which are most appropriate in given circumstances.
We saw a real need to draw together the most useful and practical tools and to make them relevant for development practitioners around the world.
“What do you mean by tool?” We heard that a lot. We were looking for really practical stuff: worksheets or templates that can be used in specific settings to facilitate project activities. Three months of desk research and 40 expert consultations later, we had found and indexed over 700 ‘tools’. Yet most of them aren’t actually tools; they are lengthy narratives or vague methods that can’t be put rapidly into use.
So, applying a second filter of practicality, we reduced the number to 212 and diligently assessed each on its individual merits: Is it visually engaging? Is it free? How actionable is it? Is there any evidence of impact? Is it self-explanatory? And most importantly, would we recommend it?
This sifting left us with 30 tools. The exact number wasn’t important: there could just as well have been 25 or 35. These were the tools judged to be ‘best-in-class’, the most relevant and usable, supporting activity across all stages of innovation. We evaluated and validated the list with colleagues and associate organisations to make sure that what we had done made sense, that we’d been comprehensive enough in our search, and that the tools we’d included were actually useful.
We also spent a lot of time speaking to people in the sector to learn about the challenges they face, their needs and current practices. This provided detailed insights that we developed into initial hypotheses about the global landscape. We heard some really interesting things, such as:
“A lot of the people I work with are ‘sons of the soil’ who don’t adhere to western innovation frameworks, but instead they come with immense local knowledge and street-smartness.”
“Innovation for me is ‘something different’. I feel like you don’t need to be groundbreaking as long as you are creating positive change.”
“I don’t like it if someone says ‘do A-B-C-D’. Of course I would like to know what to do, but I would also like to know why and how, and to feel that I am enabling myself.”
These are just a snapshot: we had a lot of different needs and preferences to take on board to ensure that the toolkit was fit-for-purpose.
Based on all this insight, our brilliant partners STBY and Quicksand took our initial list of tools and co-designed a prototype toolkit with the help of practitioners from international development agencies in Delhi and Bangalore. The prototype was made available for user testing on live issues in 30 organisations across four continents.
The user testing was the most important aspect of the project. With so many other resources available, what sets this compilation apart is that the tools we’ve drawn together have been tried and tested.
The test community consisted of large and small organisations from around the world. The user testing lasted for 10 weeks with regular check-in meetings. You can see who those kind people are on our list of partners. It was great fun working with our users and we are very grateful to everyone who participated for their commitment and generosity.
The process validated and refined our early hypotheses. It generated real-time feedback and case studies that have helped to improve the toolkit and bring otherwise mundane material to life. The tools and case studies that you see here are testimony to that collaborative effort.
The testing doesn’t stop here. We know that best-in-class will not remain best-in-class forever, so we will continue to iterate the toolkit. This website has been created so that we can do that with you. We invite you, as pioneers in the field, to share your experiences and stories with us so that we can further strengthen the tools and make the toolkit more useful.
We hope you like it, but even if you don’t we’d love to hear why and how we can improve it.
Follow us on Twitter @diytoolkit