Over the past couple of weeks, I’ve had the opportunity to speak twice about Nesta’s experiences in supporting local government to innovate with data through our Offices of Data Analytics (ODA) programme. The first was in front of an audience of mostly tech SMEs at Future Cities Catapult’s Challenges and Innovation in City Data event, and the second at a small roundtable hosted by Nesta and Mastodon C, in front of seasoned public sector innovators.
Though there have been many challenges and lessons along the way for the ODA programme, which aims to join up data held by local authorities to enable actionable insights, in this blog I outline the three I believe are the most important. They should be of equal interest to those working both inside and outside of the public sector to unlock the true value of its data.
1) Starting with a specific problem to be solved, for which data can offer impactful and actionable insight, is essential.
Though seemingly obvious, it’s hard to overstate just how critical this point is. A clearly articulated problem will focus efforts to gather data and avoid simply chucking whatever you have at an ill-defined issue, which wastes time as well as goodwill. A specific problem will also support a smoother information governance process by providing a clear legal rationale for why certain data needs to be shared.
But in a sector with no shortage of challenges big and small, it can be surprisingly tricky to pin down an actionable problem, especially one suited to a data solution. To help the public sector (and ourselves) better recognise these kinds of problems, we’ve combined our research with a framework developed by the New Orleans analytics team to highlight the types of challenges where data can help. They are categorised in the graphic below.
So don’t be afraid to take your time in finding ‘the right one’, and don’t settle for a vague problem statement; doing so will only lead to weak or barely actionable insights. Equally, remember to involve end-users in this stage (and throughout the entire project process) as much as possible. Doing so will not only support stronger analysis, but will also build much-needed buy-in for the initiative.
2) A data project in the public sector is just as often a change management project.
You can’t simply “do data to an organisation”, as one of our roundtable experts put it. Even a relatively small-scale initiative is likely to bump up against fundamental ways in which the organisation operates on a day-to-day basis. Projects with data at their core need to design this reality into their deployment plans.
First, there are the obvious changes to consider: the new technology, methods and skills needed to support a given project, though these can be bought in. The more difficult change to manage is the new way data projects require people to work together. For our London Office of Data Analytics (LODA) pilot, one consistent piece of feedback from partner boroughs was that it encouraged them to collaborate in novel ways across the organisation to meet the requirements of the pilot. Through this exposure to other teams, important gaps and opportunities for further collaboration were revealed.
Finally, there’s much to be said about the shift in organisational culture and mindset needed for projects with a strong data component. In particular, the generally high levels of risk aversion and territoriality that come with information sharing can be a major roadblock. Gaining trust and buy-in from staff who may not see data as part of their role, or who even view it with suspicion, is equally challenging.
These issues are difficult, but not insurmountable. Starting with a proof of concept phase and ensuring staff engagement throughout the process can go a long way.
3) Ultimately, quality data underpins everything.
At the Future Cities Catapult event, the evening’s five speakers all emphasised the preeminence of human relationships and the ‘people’ element of successful data projects. During the panel, an audience member instead asserted that access to quality data is the more essential piece. I would compromise and say they are equally important, as having even excellent data does not guarantee success (see points 1 and 2 above). However, it is also difficult to disagree with the fact that, with all the goodwill in the world, useful analysis cannot be undertaken without reasonably good data quality.
For the LODA pilot, which sought to identify unlicensed Houses in Multiple Occupation by joining up and analysing various data collected on all properties (e.g. location and age of property, complaints such as Anti-Social Behaviour or pests against it, occupancy numbers, council benefits received, etc.), we started out with 12 boroughs. By the end of our data gathering process, which took roughly five months, just four boroughs were deemed to have sufficiently good quality data to proceed with analysis.
There were a number of reasons for this, discussed at length in our upcoming report on that pilot, but overall the lack of standardisation in address data made it exceedingly difficult to match records to a single property, ultimately limiting the potential for actionable analysis.
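To illustrate why unstandardised address data causes record matching to fail, here is a minimal sketch of the problem and of a crude normalisation step. The function, field values and abbreviation list are invented for illustration; this is not the method used in the pilot, where matching was considerably harder than this toy example suggests.

```python
import re

def normalise_address(addr: str) -> str:
    """Crude address normalisation: lowercase, strip punctuation,
    expand a few common abbreviations, collapse whitespace.
    Purely illustrative, not a production address matcher."""
    addr = addr.lower()
    addr = re.sub(r"[^\w\s]", " ", addr)  # drop punctuation
    for pattern, repl in {r"\brd\b": "road", r"\bst\b": "street",
                          r"\bapt\b": "flat"}.items():
        addr = re.sub(pattern, repl, addr)
    return re.sub(r"\s+", " ", addr).strip()

# The same property recorded differently by two departments:
housing_record = "Flat 2, 10 High St."
benefits_record = "FLAT 2 10 High Street"

print(housing_record == benefits_record)   # False: exact match fails
print(normalise_address(housing_record)
      == normalise_address(benefits_record))  # True after normalisation
```

Even simple normalisation like this only goes so far: abbreviation lists are never complete, and genuinely ambiguous or incomplete addresses still cannot be matched reliably, which is why a shared reference key matters.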
In future, matching and merging various datasets could be improved by using consistent address data across different departments and through the use of a linking code such as the Ordnance Survey Unique Property Reference Number (linking address information with the UPRN has helped many local authorities achieve real savings). Nesta is also currently exploring the development of an online data maturity assessment tool for local authorities, intended to help spot data-related issues earlier in the project process so that adjustments can be made accordingly.
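The value of a shared linking code is that joining becomes a simple key lookup rather than fuzzy string matching. A minimal sketch of joining departmental records on a UPRN key, with all field names and values invented for illustration:

```python
# Each department keys its records by the same UPRN, so merging is
# a straightforward dictionary join. Data below is entirely made up.
property_register = {
    "100021234567": {"address": "10 High Street", "built": 1930},
    "100021234568": {"address": "12 High Street", "built": 1930},
}
asb_complaints = {"100021234567": 3}                      # complaints per UPRN
benefit_claims = {"100021234567": 2, "100021234568": 0}   # claims per UPRN

merged = {
    uprn: {**record,
           "asb_complaints": asb_complaints.get(uprn, 0),
           "benefit_claims": benefit_claims.get(uprn, 0)}
    for uprn, record in property_register.items()
}

print(merged["100021234567"])
# e.g. {'address': '10 High Street', 'built': 1930,
#       'asb_complaints': 3, 'benefit_claims': 2}
```

In practice the same join would be done with a tool like pandas across thousands of records, but the principle is identical: one consistent key per property, agreed across departments.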
We have learned much and are exceedingly thankful to the local authorities that have partnered with us to further our collective understanding of how innovation happens in the public sector. We look forward to soon sharing our final report on the LODA pilot, as well as a summary report from our recent roundtable with public sector data innovators.