Abstract
Recent research has demonstrated that cognitive biases such as the confirmation bias or the anchoring effect can negatively affect the quality of crowdsourced data. In practice, however, such biases go unnoticed unless specifically assessed or controlled for. Task requesters need to ensure that task workflow and design choices do not trigger workers’ cognitive biases. Moreover, to facilitate the reuse of crowdsourced data collections, practitioners can benefit from understanding whether, and which, cognitive biases may be associated with the data. To this end, we propose a 12-item checklist adapted from business psychology to combat cognitive biases in crowdsourcing. We demonstrate the practical application of this checklist in a case study on viewpoint annotations for search results. Through a retrospective analysis of relevant crowdsourcing research published at HCOMP in 2018, 2019, and 2020, we show that cognitive biases may often affect crowd workers but are typically not considered as potential sources of poor data quality. The checklist we propose is a practical tool that requesters can use to improve their task designs and appropriately describe potential limitations of collected data. It contributes to a body of efforts towards making human-labeled data more reliable and reusable.
Original language | English |
---|---|
Title of host publication | Proceedings of the AAAI Conference on Human Computation and Crowdsourcing |
Editors | Ece Kamar, Kurt Luther |
Publisher | Association for the Advancement of Artificial Intelligence (AAAI) |
Pages | 48-59 |
Number of pages | 12 |
Volume | 9 |
ISBN (Print) | 978-1-57735-872-5 |
Publication status | Published - 2021 |
Event | The Ninth AAAI Conference on Human Computation and Crowdsourcing, Virtual conference. Duration: 14 Nov 2021 → 18 Nov 2021. Conference number: 9th |
Conference
Conference | The Ninth AAAI Conference on Human Computation and Crowdsourcing |
---|---|
Abbreviated title | HCOMP 2021 |
City | Virtual conference |
Period | 14/11/21 → 18/11/21 |
Bibliographical note
Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project, https://www.openaccess.nl/en/you-share-we-take-care. Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.
Keywords
- Crowdsourcing
- Human-labeled Data
- Subjective Judgments
- Cognitive Bias
- Data Quality