Hear Me Out: A Study on the Use of the Voice Modality for Crowdsourced Relevance Assessments

Nirmal Roy, Agathe Balayn, David Maxwell, Claudia Hauff

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

Abstract

The creation of relevance assessments by human assessors (nowadays often crowdworkers) is a vital step when building IR test collections. Prior work has investigated assessor quality and behaviour, and tooling to support assessors in their task. We have few insights, though, into the impact of a document's presentation modality on assessor efficiency and effectiveness. Given the rise of voice-based interfaces, we investigate whether it is feasible for assessors to judge the relevance of text documents via a voice-based interface. We ran a user study (n = 49) on a crowdsourcing platform in which participants judged the relevance of short and long documents, sampled from the TREC Deep Learning corpus, presented to them in either the text or the voice modality. We found that: (i) participants are equally accurate in their judgements across both the text and voice modalities; (ii) with increased document length, it takes participants significantly longer to make relevance judgements in the voice condition (for documents longer than 120 words, almost twice as long); and (iii) the ability of assessors to ignore stimuli that are not relevant (i.e., inhibition) impacts assessment quality in the voice modality: assessors with higher inhibition are significantly more accurate than those with lower inhibition. Our results indicate that we can reliably leverage the voice modality as a means to effectively collect relevance labels from crowdworkers.

Original language: English
Title of host publication: SIGIR 2023 - Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval
Publisher: Association for Computing Machinery (ACM)
Pages: 718-728
Number of pages: 11
ISBN (Electronic): 9781450394086
DOIs
Publication status: Published - 2023
Event: 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2023 - Taipei, Taiwan
Duration: 23 Jul 2023 - 27 Jul 2023

Publication series

Name: SIGIR 2023 - Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval

Conference

Conference: 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2023
Country/Territory: Taiwan
City: Taipei
Period: 23/07/23 - 27/07/23

Keywords

  • Cognitive Ability
  • Crowdsourcing
  • Data Annotation
  • Relevance Assessment
  • User Interfaces
