The role of digital technologies in improving data and a challenge to innovators
The World Health Organization estimates that 92 per cent of the world’s population live in areas with unsafe levels of air pollution, with outdoor pollution linked to around three million deaths every year. In London, Mayor Sadiq Khan has identified the city’s bad air as the most significant environmental issue facing the capital. Indeed, London has just used up its legal allowance of air pollution for the whole year, before the end of January (a marked improvement on recent years). So what should we do about it?
In the UK we have a good understanding of the main sources of pollution. In urban centres like London, road transport contributes around half of nitrogen dioxide and particulate matter emissions. The remainder comes from multiple sources, including construction, industry, fuel burning and other forms of transport such as rail and aviation.
Despite a good understanding of where pollutants are coming from, levels in London are consistently higher than legislative limits. Putney High Street, for example, breached the hourly limit for nitrogen dioxide more than 1,200 times last year; by law, that limit should not be breached more than 18 times a year.
Reducing these emissions is challenging because of difficult trade-offs between environmental, economic and political concerns. There are multiple actors involved, and change is needed in government policy, corporate behaviour and individual actions alike. These varied actors need to be engaged in the issue, understand the problem and be aware of potential solutions.
For that you need data. It’s an old management adage that ‘you can’t manage what you can’t measure’, and often the more you can measure, the more power you have to affect an issue. The UK is in a position of strength when it comes to air quality data, with Defra operating a comprehensive national monitoring network. This high-accuracy, high-cost network is supplemented by monitoring at local authority level, and the gaps in space and time are filled with modelling.
However, the emergence of low-cost and distributed sensing, along with open and big data, has blown the possibilities of air quality data wide open.
We know that air pollution is a problem. And we know what the major sources are. So how does more data improve air quality?
Whilst a broad network of high-quality instruments provides a comprehensive overview of pollution trends and city-scale hotspots, these instruments can’t tell an individual about their personal exposure. Low-cost and distributed sensing could provide this information and allow individuals to adjust their behaviour accordingly, even in real time. It can empower communities to build campaigns and put pressure on government and the private sector. And through citizen science, individuals can take ownership of the issue by capturing data themselves.
Better data means that you can more effectively target interventions, from national and local policy to individual decisions. You can identify particular hotspots and sources that were previously hidden, and you can gain a better understanding of the impact of particular policies. This is crucial to designing effective strategies for managing air pollution.
Thinking longer term, you could envision an autonomous smart city with traffic management systems that adjust dynamically based on air quality data, or highly accurate predictive modelling of air quality that informs decisions such as infrastructure planning and low emission zones.
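As a purely illustrative toy, here is roughly what a rule-based traffic response might look like. The readings, decision rules and actions are all invented for the sketch rather than drawn from any real system, though the 200 µg/m³ figure is the genuine UK/EU hourly legal limit for nitrogen dioxide.

```python
# A toy sketch of a rule a dynamic traffic management system might apply:
# when monitored NO2 at a junction stays above the legal limit, intervene.
# Readings, rules and actions are invented for illustration only.

NO2_LIMIT_UG_M3 = 200  # UK/EU hourly legal limit for nitrogen dioxide

def traffic_action(recent_hourly_no2: list[float]) -> str:
    """Decide a traffic intervention from the last few hourly NO2 readings."""
    if all(reading > NO2_LIMIT_UG_M3 for reading in recent_hourly_no2[-3:]):
        return "divert"  # sustained exceedance: reroute through traffic
    if recent_hourly_no2[-1] > NO2_LIMIT_UG_M3:
        return "slow"    # single exceedance: reduce speed limits
    return "normal"

print(traffic_action([150, 220, 230, 240]))  # 'divert'
```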
Using data effectively means getting the right information, to the right people, at the right time. It requires data to be robust and readily accessible, but also presented in the right way.
Air quality data is increasingly open, and aggregators have emerged to pool all this data into one place. Already, a range of apps uses monitoring information to inform interested users of the air quality around them, using various numerical and colour-coded indicators. Some companies are applying machine learning algorithms to create sophisticated real-time or predictive models of air quality.
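To make the colour-coded indicators concrete, here is a minimal sketch of the kind of lookup such an app might perform. The breakpoints are illustrative, loosely based on the UK Daily Air Quality Index bands for hourly mean nitrogen dioxide; a real app would implement the full official index across several pollutants.

```python
# A minimal sketch of a colour-coded air quality indicator. Breakpoints are
# illustrative, loosely based on UK Daily Air Quality Index (DAQI) bands for
# hourly mean NO2; a real app would use the full official index.

def no2_band(concentration_ug_m3: float) -> tuple[str, str]:
    """Map an hourly mean NO2 concentration (µg/m³) to a (band, colour) pair."""
    if concentration_ug_m3 <= 200:
        return "Low", "green"
    if concentration_ug_m3 <= 400:
        return "Moderate", "yellow"
    if concentration_ug_m3 <= 600:
        return "High", "red"
    return "Very High", "purple"

print(no2_band(250))  # ('Moderate', 'yellow')
```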
However, the use of data to inspire change is still in its infancy. Could having air quality data communicated at the right time influence house prices? Or change people’s behaviour when buying a car? What about low emission route mapping? Or dynamic air conditioning in buildings that recycles internal air during high pollution incidents? New technological developments and increasingly open data enable new approaches. But at the same time as encouraging innovators to do more with existing air quality data, we need to improve the raw information itself.
Arguably the biggest innovation trend in air quality in recent years has been the emergence of low-cost, consumer-level air quality monitors. These can be static, mobile, even wearable, and they have flooded the market, often with flashy marketing and glossy interfaces.
These sensors are a revolution in our toolkit for tackling air quality, but they are yet to deliver on their promise. The outside world is very different from a lab environment, and air quality sensors are affected by a range of factors such as temperature, humidity and the presence of other compounds in the air. Building systems that are robust and reliable, especially over time, is no small feat. Developers often lack the sophisticated laboratory facilities required, and instead rely on off-the-shelf components combined with complex (usually black-boxed) algorithms to compensate for technical shortfalls.
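As a rough illustration of what such a correction algorithm might involve (real commercial ones are far more sophisticated), here is a sketch that fits a simple linear model correcting a low-cost sensor’s raw signal using temperature, humidity and co-located reference measurements. The data is synthetic and the model deliberately simple.

```python
# A deliberately simple calibration sketch: fit a linear model predicting the
# reference instrument's reading from the low-cost sensor's raw signal plus
# temperature and humidity. Synthetic data, for illustration only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
temperature = rng.uniform(0, 30, n)   # °C
humidity = rng.uniform(30, 90, n)     # % relative humidity
true_no2 = rng.uniform(10, 150, n)    # µg/m³, "reference" measurement

# Simulate a low-cost sensor whose raw output drifts with temperature and
# humidity on top of random noise.
raw_signal = true_no2 + 0.8 * temperature - 0.3 * humidity + rng.normal(0, 5, n)

X = np.column_stack([raw_signal, temperature, humidity])
model = LinearRegression().fit(X, true_no2)
corrected = model.predict(X)

print(f"RMS error before correction: {np.sqrt(np.mean((raw_signal - true_no2)**2)):.1f} µg/m³")
print(f"RMS error after correction:  {np.sqrt(np.mean((corrected - true_no2)**2)):.1f} µg/m³")
```

In practice the calibration would be validated against data the model has not seen, and would need to hold up as the sensor ages, which is exactly where many current devices fall short.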
This has meant that the reliability of low-cost sensors is extremely variable, and uncertainties in the data restrict what can be done with these devices. This inhibits large-scale deployment, prevents data being integrated with other sources, and risks a public backlash if devices bought by consumers don’t generate reliable information.
There is currently no standardisation or accreditation for low-cost air quality monitors, and so no simple way for a consumer to know how accurate a device is. These concerns were highlighted in a recent comment piece in Nature. The UN’s World Meteorological Organization and the US Environmental Protection Agency have released guidance on the use of low-cost sensors, while a European Union working group has begun a standardisation process.
We believe a logical next step in unlocking the potential of low-cost sensors is an innovation challenge for developers, providing incubation support to a range of projects in order to build an ecosystem of properly validated sensors.
Innovation challenges are a good mechanism for focusing the efforts of a community of innovators on a particular problem and for encouraging new ideas from unusual sources. Following the success of the Dynamic Demand Challenge, the Challenge Prize Centre at Nesta has teamed up again with the National Physical Laboratory (NPL) to develop a proposal for a new innovation prize to improve air quality. The objective is to enable better use of data to tackle air quality: supporting the development of higher-quality, low-cost air quality monitors, and encouraging novel uses of air quality data to create the behaviour change that will lead to better health outcomes.
NPL has unique testing facilities that can support the development of air quality monitors and is involved in the standardisation discussions, so entrants would come out with properly tested, market-ready devices. Nesta has a history of data challenges, including the Open Data Challenge Series and the Open Up Challenge.
Solving our air quality crisis will require a mix of effective policy and social change. Fit-for-purpose low-cost sensors and accessible data have the potential to enable both.
We're currently looking for partners to make this innovation challenge a reality. If you’d like to find out more, please email: [email protected]