Microtask crowdsourcing for music score transcriptions: an experiment with error detection

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review


Abstract

Human annotation is still an essential part of modern transcription workflows for digitizing music scores, either as a standalone approach in which a single expert annotator transcribes a complete score, or as support for an automated Optical Music Recognition (OMR) system. Research on human computation has shown the effectiveness of crowdsourcing for scaling out human work by defining a large number of microtasks that can easily be distributed and executed. However, microtask design for music transcription remains an unaddressed research area. This paper focuses on the design of a crowdsourcing task for detecting errors in a score transcription, which can be deployed in either automated or human-driven transcription workflows. We conduct an experiment in which we study two design parameters: 1) the size of the score to be annotated and 2) the modality in which it is presented in the user interface. We analyze the performance and reliability of non-specialised crowdworkers on Amazon Mechanical Turk with respect to these design parameters, differentiated by worker experience and type of transcription error. Results are encouraging, and pave the way for scalable and efficient crowd-assisted music transcription systems.
Original language: English
Title of host publication: Proceedings of the 21st International Society for Music Information Retrieval Conference
Number of pages: 7
ISBN (Electronic): 978-0-9813537-0-8
Publication status: Published - 2020
Event: 21st International Society for Music Information Retrieval Conference
Duration: 11 Oct 2020 – 15 Oct 2020
Conference number: 21
https://program.ismir2020.net

Conference

Conference: 21st International Society for Music Information Retrieval Conference
Abbreviated title: ISMIR 2020
Period: 11/10/20 – 15/10/20
Other: Virtual/online event due to COVID-19
Internet address: https://program.ismir2020.net
