The Effects of Crowd Worker Biases in Fact-Checking Tasks

Tim Draws, David La Barbera, Michael Soprano, Kevin Roitero, Davide Ceolin, Alessandro Checco, Stefano Mizzaro

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

9 Citations (Scopus)
17 Downloads (Pure)


Due to the increasing amount of information shared online every day, the need for sound and reliable ways of distinguishing between trustworthy and non-trustworthy information is as present as ever. One technique for performing fact-checking at scale is to employ human intelligence in the form of crowd workers. Although earlier work has suggested that crowd workers can reliably identify misinformation, cognitive biases of crowd workers may reduce the quality of truthfulness judgments in this context. We performed a systematic exploratory analysis of publicly available crowdsourced data to identify a set of potential systematic biases that may occur when crowd workers perform fact-checking tasks. Following this exploratory study, we collected a novel data set of crowdsourced truthfulness judgments to validate our hypotheses. Our findings suggest that workers generally overestimate the truthfulness of statements and that different individual characteristics (i.e., their belief in science) and cognitive biases (i.e., the affect heuristic and overconfidence) can affect their annotations. Interestingly, we find that, depending on the general judgment tendencies of workers, their biases may sometimes lead to more accurate judgments.

Original language: English
Title of host publication: Proceedings of 2022 5th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2022
Publisher: Association for Computing Machinery (ACM)
Number of pages: 11
ISBN (Electronic): 978-1-4503-9352-2
Publication status: Published - 2022
Event: 5th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2022 - Virtual, Online, Korea, Republic of
Duration: 21 Jun 2022 - 24 Jun 2022

Publication series

Name: ACM International Conference Proceeding Series


Conference: 5th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2022
Country/Territory: Korea, Republic of
City: Virtual, Online

Bibliographical note

Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project
Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work and the author uses Dutch legislation to make this work public.


  • Bias
  • Crowdsourcing
  • Explainability
  • Misinformation
  • Truthfulness

