Abstract
Crowdsourcing has established itself as a powerful tool for multimedia researchers and is commonly used to collect human input for a variety of purposes. It is also fairly widespread practice to control users' contributions based on the quality of their input. This paper points out that applying this practice to subjective assessment tasks may lead to an undesired outcome. We present a crowdsourcing experiment and discuss the ways in which quality control in crowdsourcing studies can lead to a phenomenon akin to a self-fulfilling prophecy. This paper is intended to trigger discussion and lead to more deeply reflective crowdsourcing practices in the multimedia context.
Original language | English |
---|---|
Title of host publication | 2016 14th International Workshop on Content-Based Multimedia Indexing (CBMI) |
Publisher | IEEE |
Pages | 1-6 |
Number of pages | 6 |
ISBN (Electronic) | 978-1-4673-8695-1 |
DOIs | |
Publication status | Published - 2016 |
Event | 2016 14th International Workshop on Content-Based Multimedia Indexing, Bucharest, Romania. Duration: 15 Jun 2016 → 17 Jun 2016. http://cbmi2016.upb.ro/ |
Conference
Conference | 2016 14th International Workshop on Content-Based Multimedia Indexing |
---|---|
Abbreviated title | CBMI |
Country/Territory | Romania |
City | Bucharest |
Period | 15/06/16 → 17/06/16 |
Internet address | http://cbmi2016.upb.ro/ |
Keywords
- Crowdsourcing
- Multimedia communication
- Quality of service
- Cameras
- Distortion
- Standards
- Physiology