How to reduce the effect of the 'power user' bias that emerges in most crowdsourcing platforms.
Many of the collaboratively developed knowledge platforms we discussed at our recent conference, At The Roots of Collective Intelligence, suffer from a well-known “contributors’ bias”.
More than 85% of Wikipedia's entries have been written by men.
OpenStack, like most other open-source projects, has seen the emergence of a small group of developers who author the majority of the code. In fact, 80% of the commits have been authored by slightly less than 8% of the authors, while 90% of the commits correspond to about 17% of all the authors.
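Concentration figures like these can be computed directly from per-author commit counts. A minimal sketch, using hypothetical commit data (the function name and numbers are illustrative, not from the OpenStack study):

```python
# Sketch: what fraction of authors accounts for a given share of commits?
# Commit counts below are hypothetical, for illustration only.

def authors_for_commit_share(commits_per_author, share):
    """Return the fraction of authors needed to cover `share` of all
    commits, counting the most prolific authors first."""
    counts = sorted(commits_per_author, reverse=True)
    target = share * sum(counts)
    covered = 0
    for i, count in enumerate(counts, start=1):
        covered += count
        if covered >= target:
            return i / len(counts)
    return 1.0

# A skewed distribution: a few power users, many occasional contributors.
commits = [500, 400, 300] + [5] * 97
print(authors_for_commit_share(commits, 0.80))
```

Running this on a more even distribution, by contrast, returns a fraction close to the share itself, which is one simple way to make the power-user effect visible in a project's own data.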
GitHub's Be Social function allows users to “follow” other participants and receive notifications of their activity. The most popular contributors therefore tend to attract other users to the projects they are working on. And OpenStreetMap has 1.2 million registered users, but fewer than 15% of them have produced the majority of its 13 million elements of information.
Research by Quattrone, Capra and De Meo (2015) showed that while the content mapped did not differ between active and occasional mappers, the social composition of the power users led to a geographical bias, with less affluent areas remaining unmapped more often than urban centres.
These well-known biases in crowdsourced information, also known as the 'power user' effect, were discussed by Professor Licia Capra from the Department of Engineering at UCL. Watch the video of her talk here.
In essence, despite the fact that crowdsourcing platforms are inclusive and open to anyone willing to dedicate the time and effort, there is a process of self-selection. Different factors can explain why certain gender and socio-economic groups are drawn to specific activities, but it is clear that the diversity of contributors progressively shrinks over time.
The effect is more extreme where continuous contributions are needed. As data from the Humanitarian OpenStreetMap Team project showed, humanitarian crises attract many users who contribute intensely for a short time, but very few participants contribute regularly over the long term. Only a small proportion of power users continue editing or adding code for sustained periods. This effect raises two important questions: does the editing work of the active few skew the information made available, and what can be done to avoid this type of concentration?
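The distinction between short bursts of activity and sustained participation can itself be measured from an edit log. A minimal sketch, assuming a simple (user, month) record per edit; the data and the three-month threshold are hypothetical:

```python
# Sketch: separate contributors active across many months from those
# who contribute in a single burst. Edit records are hypothetical.
from collections import defaultdict

def active_months(edits):
    """Map each contributor to the set of months in which they edited.
    `edits` is a list of (user, month) pairs, e.g. ("alice", "2015-04")."""
    months = defaultdict(set)
    for user, month in edits:
        months[user].add(month)
    return months

def sustained_contributors(edits, min_months=3):
    """Contributors active in at least `min_months` distinct months."""
    return {u for u, m in active_months(edits).items() if len(m) >= min_months}

edits = [
    ("alice", "2015-04"), ("alice", "2015-05"), ("alice", "2015-06"),
    ("bob", "2015-04"), ("bob", "2015-04"),   # one intense burst
    ("carol", "2015-04"),
]
print(sustained_contributors(edits))  # only the long-term contributor remains
```

Tracking this set over time is one way a platform could monitor whether its contributor base is narrowing towards a small core.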
Capra convincingly argued that one solution is to increase the visibility and transparency of these biases. Displaying, for instance, the neighbourhoods or areas that are not mapped, or the topics that are not defined, encourages contributors to take corrective action to reduce the imbalance. Exposing the lack of certain content or voices does seem to attract people to compensate for the missing information.
Other solutions can be designed into the software to ease some of the barriers that potentially limit the participation of some categories of users. Cutting contributions into small chunks rather than complex and lengthy tasks ensures that people with less time or fewer digital skills can contribute. Just as the design of the interface and software can deter participation through complexity, it can also ease access for a wider section of internet users and support them in counterbalancing some of the most obvious biases.
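The chunking idea above can be sketched very simply: a long task list is split into microtasks small enough to finish in one short session. The task names and chunk size here are hypothetical:

```python
# Sketch: split a long task list into small microtasks so occasional
# contributors can complete one in a short session. Names are hypothetical.

def chunk_tasks(tasks, chunk_size):
    """Yield consecutive chunks of at most `chunk_size` tasks."""
    for i in range(0, len(tasks), chunk_size):
        yield tasks[i:i + chunk_size]

streets_to_map = [f"street-{n}" for n in range(10)]
for microtask in chunk_tasks(streets_to_map, 3):
    print(microtask)
```

The design choice is the size of the chunk: small enough that a newcomer can complete one without specialist skills, large enough that each completed chunk is still a meaningful contribution.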
The issue of how to attract more volunteers and editors is more complex, and it is a constant challenge for any crowdsourcing platform. We can look back at when Wikipedia started losing contributors, which coincided with a period of tighter restrictions on the editing process. This suggests that alongside designing the interface so that contributions are easy to create and share, it is also necessary to design practices and social norms that are immediately and continuously inclusive.