A Checklist to Combat Cognitive Biases in Crowdsourcing

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review


Abstract

Recent research has demonstrated that cognitive biases such as the confirmation bias or the anchoring effect can negatively affect the quality of crowdsourced data. In practice, however, such biases go unnoticed unless specifically assessed or controlled for. Task requesters need to ensure that task workflow and design choices do not trigger workers’ cognitive biases. Moreover, to facilitate the reuse of crowdsourced data collections, practitioners can benefit from understanding whether and which cognitive biases may be associated with the data. To this end, we propose a 12-item checklist adapted from business psychology to combat cognitive biases in crowdsourcing. We demonstrate the practical application of this checklist in a case study on viewpoint annotations for search results. Through a retrospective analysis of relevant crowdsourcing research that has been published at HCOMP in 2018, 2019, and 2020, we show that cognitive biases may often affect crowd workers but are typically not considered as potential sources of poor data quality. The checklist we propose is a practical tool that requesters can use to improve their task designs and appropriately describe potential limitations of collected data. It contributes to a body of efforts towards making human-labeled data more reliable and reusable.
Original language: English
Title of host publication: Proceedings of the AAAI Conference on Human Computation and Crowdsourcing
Editors: Ece Kamar, Kurt Luther
Publisher: Association for the Advancement of Artificial Intelligence (AAAI)
Pages: 48-59
Number of pages: 12
Volume: 9
ISBN (Print): 978-1-57735-872-5
Publication status: Published - 2021
Event: The Ninth AAAI Conference on Human Computation and Crowdsourcing - Virtual conference
Duration: 14 Nov 2021 – 18 Nov 2021
Conference number: 9th

Conference

Conference: The Ninth AAAI Conference on Human Computation and Crowdsourcing
Abbreviated title: HCOMP 2021
City: Virtual conference
Period: 14/11/21 – 18/11/21

Bibliographical note

Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project https://www.openaccess.nl/en/you-share-we-take-care

Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.

Keywords

  • Crowdsourcing
  • Human-labeled Data
  • Subjective Judgments
  • Cognitive Bias
  • Data Quality

