Geo-Wiki: Good data quality can be achieved from crowdsourcing

A Geo-Wiki study of more than 53,000 samples of human impact and land cover, collected from over 60 individuals of varying expertise, showed that non-experts were as good as experts at identifying human impact on land cover and that, with additional training, they could perform as well as experts at classifying specific land cover types.

Wheat © Fodor90 | iStock

Earth Observation Systems (EOS) researcher Linda See and colleagues [1] validated 53,000 samples of land cover data checked by both "amateurs" and "experts" over the past five years under the IIASA Geo-Wiki project. They report in the journal PLOS ONE that citizen scientists performed as well as, and in some cases better than, experts, including experts in remote sensing and geospatial sciences.

Geo-Wiki uses satellite data to categorize land cover and identify places where people live and farm. The EOS researchers checked 53,000 categorizations made by amateurs and experts, concluding that experts identified land cover correctly 69% of the time while citizen scientists made correct classifications 62% of the time. Experts were better at identifying specific land cover types such as forest, farmland, grassland, or desert.
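To illustrate the kind of comparison behind these figures, the sketch below computes the share of samples on which each group's land cover label agrees with a control dataset. The data, labels, and function name are purely illustrative assumptions and are not the study's actual code or data.

# Minimal sketch (not the authors' code): comparing volunteer and expert
# land cover classifications against a control dataset.

def agreement_rate(classifications, control):
    """Fraction of samples whose label matches the control label."""
    matches = sum(1 for sample_id, label in classifications.items()
                  if control.get(sample_id) == label)
    return matches / len(classifications)

# Hypothetical labels assigned by each group for the same four samples.
control = {1: "forest", 2: "cropland", 3: "grassland", 4: "desert"}
experts = {1: "forest", 2: "cropland", 3: "grassland", 4: "shrubland"}
crowd   = {1: "forest", 2: "grassland", 3: "grassland", 4: "shrubland"}

print(f"Expert agreement: {agreement_rate(experts, control):.0%}")  # 75%
print(f"Crowd agreement:  {agreement_rate(crowd, control):.0%}")    # 50%

In the study itself, agreement was assessed over tens of thousands of samples, which is what yields the 69% versus 62% figures quoted above.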

Citizen scientists are nonprofessional volunteers who participate in scientific research. Citizen science is also known as crowd science, crowd-sourced science, or networked science.

The researchers note that citizen science is becoming more important as government funding for research has been reduced and the growth of the internet has allowed more people to take part in research.

The Geo-Wiki concept has attracted considerable media attention [2] [3] [4].

Figure 1. The spatial distribution of crowdsourced samples, showing the degree of human impact globally.


References

[1] See L, Comber A, Salk C, Fritz S, van der Velde M, Perger C, Schill C, McCallum I, Kraxner F, Obersteiner M (2013). Comparing the quality of crowdsourced data contributed by experts and non-experts. PLOS ONE, 31 July 2013. DOI: 10.1371/journal.pone.0069958

[2] Spectrum. Deutsche Welle radio episode.
[3] How do "Citizen Scientists" stack up against the experts? Citizen Science.
[4] Experts trumped by citizen scientists in land use analysis project. News, Science and Space.

Collaborators

Dr. Alexis Comber, Department of Geography, University of Leicester, United Kingdom.



Last edited: 22 May 2014

CONTACT DETAILS

Steffen Fritz

Program Director and Principal Research Scholar, Strategic Initiatives Program

Principal Research Scholar, Novel Data Ecosystems for Sustainability Research Group, Advancing Systems Analysis Program

PODCAST

Listen to a report on this study by Deutsche Welle's Spectrum radio show.


International Institute for Applied Systems Analysis (IIASA)
Schlossplatz 1, A-2361 Laxenburg, Austria
Phone: (+43 2236) 807 0  Fax: (+43 2236) 71 313