Reviewing versus doing: Learning and performance in crowd assessment

Haiyi Zhu, Steven P. Dow, Robert E. Kraut, Aniket Kittur

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

32 Scopus citations

Abstract

In modern crowdsourcing markets, requesters face the challenge of training and managing large transient workforces. Requesters can hire peer workers to review others' work, but the value may be marginal, especially if the reviewers lack requisite knowledge. Our research explores if and how workers learn and improve their performance in a task domain by serving as peer reviewers. Further, we investigate whether peer reviewing may be more effective in teams where the reviewers can reach consensus through discussion. An online between-subjects experiment compares the tradeoffs of reviewing versus producing work using three different organization strategies: working individually, working as an interactive team, and aggregating individuals into nominal groups. The results show that workers who review others' work perform better on subsequent tasks than workers who just produce. We also find that interactive reviewer teams outperform individual reviewers on all quality measures. However, aggregating individual reviewers into nominal groups produces better quality assessments than interactive teams, except in task domains where discussion helps overcome individual misconceptions.

Original language: English (US)
Title of host publication: CSCW 2014 - Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work and Social Computing
Publisher: Association for Computing Machinery
Pages: 1445-1455
Number of pages: 11
ISBN (Print): 9781450325400
State: Published - 2014
Event: 17th ACM Conference on Computer Supported Cooperative Work and Social Computing, CSCW 2014 - Baltimore, MD, United States
Duration: Feb 15, 2014 - Feb 19, 2014

Publication series

Name: Proceedings of the ACM Conference on Computer Supported Cooperative Work, CSCW


Keywords

  • Assessment
  • Crowdsourcing
  • Learning
  • Review

