Office of Data Analytics (ODA) projects are complex, especially when they involve data sharing across different local authorities.
To help you navigate the challenges of planning, designing and running an ODA project, and of using data legally and ethically, Nesta has recently published a draft “Complete Guide to Public Sector Data Analytics”. The guide details a four-step methodology to help public authorities identify local challenges that could be tackled with data analytics.
We’ll now turn theory into practice, distilling three tips on how to keep projects on track, respond to changes in requirements and keep stakeholders on board at the same time.
Drawing from our experience piloting the London Office of Data Analytics on HMOs and working with the Essex Centre for Data Analytics, we believe that adopting an agile approach considerably helps in dealing with the novelty, complexity and scale of an ODA project.
Rather than following a linear development process with predefined tasks and schedules, the project should be phased into small, incremental sprints. At the start of each sprint, its duration is defined and a running list of deliverables is planned. Deliverables are prioritised by their business value; if they cannot be completed on time, the work can be re-prioritised and the information used for future sprint planning. This allows more flexibility in the workflow and helps you overcome unexpected obstacles.
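The sprint workflow above can be sketched in a few lines of code. This is a minimal illustration, not a real planning tool: the deliverables, their business values and the sprint capacity are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Deliverable:
    name: str
    business_value: int  # higher = more valuable to stakeholders
    done: bool = False

# Hypothetical backlog for an ODA data-sharing pilot
backlog = [
    Deliverable("Prototype record-matching script", business_value=3),
    Deliverable("Draft data-sharing agreement", business_value=5),
    Deliverable("Map datasets held by each authority", business_value=4),
]

def plan_sprint(items, capacity):
    """Pick the highest-value unfinished deliverables that fit this sprint."""
    open_items = [d for d in items if not d.done]
    open_items.sort(key=lambda d: d.business_value, reverse=True)
    return open_items[:capacity]

def carry_over(items):
    """Anything unfinished is re-prioritised and fed into the next sprint plan."""
    return [d for d in items if not d.done]

sprint = plan_sprint(backlog, capacity=2)
```

The point is simply that the plan is recomputed at every sprint boundary from whatever remains undone, rather than fixed up front.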
Another distinctive feature of agile is improved collaboration between developers and customers. A high degree of communication and exchange is needed not only when starting out, but throughout the entire project, from the discovery phase to testing hypotheses and prototypes. This will ensure that you deliver value to customers more rapidly and consistently.
It goes without saying that customers should always be involved in the decision to adopt agile practices in the first place, not least to understand how much time they can commit and their willingness to do so.
We are conscious that often, especially for local authorities, projects receive support and funding from senior leadership because they respond to overarching national or local strategic priorities. As a result, most projects are defined top-down first, with their feasibility explored only later.
A discovery phase should be undertaken before the implementation of any product prototype. Some of it can be done through desk research, but it should also include expert interviews and user design workshops with people whose work the data analytics project is intended to support. Co-designing and testing potential solutions with professionals, both service managers and front line staff in the field, is crucial at every step of an ODA programme. This is the only way you can make sure the solution you are developing will be considered helpful and therefore adopted.
Last but not least, a data ethics assessment and an impact assessment on individuals and groups are essential when considering the risks of different approaches before they are put into action.
What happens if, at the end of the discovery phase, the pre-defined objectives of the project no longer seem achievable or realistic?
Projects should not be ‘doomed to succeed’.
If findings from the discovery phase point to a change of direction, you should act on them and come up with an intermediate or alternative route.
One example of this responsive approach is the project we’ve recently been working on: identifying data-driven approaches to tackling business exploitation of victims of modern slavery.
Over the months of the discovery phase, we tested our hypotheses and assumptions, researched best practices, regulations and available data, and plumbed the depths of frontline professional experience. The landmark Modern Slavery Act 2015 was the first of its kind in Europe to specifically address slavery and human trafficking in the 21st century. However, we found that whilst it is a step toward identifying victims and bringing offenders to justice, this relatively young legislation has generated little data, and what exists is disparate and often of poor quality.
By conducting expert interviews and running workshops with frontline workers, we’ve recreated the entire journey of a business, from opening up shop to recruiting workers and delivering the service. This allowed us to identify businesses’ touchpoints with local authorities and other agencies in order to scope data availability.
Having investigated this and the current collaborative business inspection practice (or lack thereof) around it, we concluded that it would not be appropriate at this stage to build a predictive algorithm to identify businesses with a higher likelihood of exploiting modern slavery victims. This is something we could confidently communicate to senior stakeholders, given it is based on research and evidence.
However, through our interviews we have identified an intermediate step.
What if we could integrate different sources of data to provide a cross-organisational view of inspection outcomes, with a view to identifying businesses of concern in general?
Instead of assuming that there are ways to predict which businesses are likely to exploit victims of modern slavery specifically, we will first test whether we can collate the information shared by the different agencies inspecting businesses in the local area to identify risk factors for businesses of concern more widely.
This phase paves the way for a more sophisticated data product to which a layer of prediction can later be added. In the meantime, it adds value to a broader range of organisations using the tool.
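To make the collation idea concrete, here is a minimal sketch of cross-agency inspection data being brought together. The agency names, field names and records are entirely hypothetical, and a real system would of course sit behind proper data-sharing agreements; the sketch only shows the shape of the logic: a business is surfaced when more than one agency has independently recorded a concern.

```python
from collections import defaultdict

# Hypothetical inspection records shared by different agencies;
# field names and values are illustrative, not from any real system.
inspections = [
    {"business_id": "B001", "agency": "Environmental Health", "outcome": "concern"},
    {"business_id": "B001", "agency": "Trading Standards", "outcome": "concern"},
    {"business_id": "B002", "agency": "Environmental Health", "outcome": "pass"},
    {"business_id": "B003", "agency": "Fire Service", "outcome": "concern"},
]

def businesses_of_concern(records, min_agencies=2):
    """Collate outcomes across agencies and flag businesses that at least
    `min_agencies` distinct agencies have independently marked as a concern."""
    flagged = defaultdict(set)
    for r in records:
        if r["outcome"] == "concern":
            flagged[r["business_id"]].add(r["agency"])
    return {b for b, agencies in flagged.items() if len(agencies) >= min_agencies}

print(businesses_of_concern(inspections))  # → {'B001'}
```

No prediction is involved here: the cross-organisational view alone surfaces businesses that no single agency would have flagged on its own, which is exactly the intermediate value described above.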
We appreciate that the process we’ve described so far looks quite daunting.
The narrative of many providers, unfortunately quite widespread, is that there is a technological fix (usually their product) that will magically do all of the above for you. However, no matter how cutting-edge or expensive the product is, we can confidently predict that it will not work if the solution is not tailored to your needs and organisational culture. This is because what you are introducing will need to support not only how the data is used, but also the way people access and work with it.
To know all this, we’re afraid you have to go through the entire process. But hopefully, at the end of the discovery phase you will be able to identify the features of a ‘minimum viable product’ to be tested with the end users.
We will get in touch soon to give you an update on our next phase, but until then, we’d like to hear about your experiences. We are particularly interested in the hurdles you’ve faced in deploying your data initiatives and your real-life examples of adopting an agile methodology.
Do you have any additional tips on how to keep data projects on the right track to success?