Abstract
Crowdsourcing has emerged as an effective method of scaling up tasks previously reserved for a small set of experts. Accordingly, researchers have begun to employ crowdworkers to conduct research about large-scale, open online learning. Here we report results from a crowdsourcing study (N=135) evaluating the extent to which crowdworkers and MOOC learners behave comparably on lecture viewing and quiz tasks, the most utilized learning activities in MOOCs. This serves to (i) validate the assumption of previous research that crowdworkers are reliable proxies for online learners and (ii) assess the potential of employing crowdworkers as a means of testing online learning environments. Overall, we observe mixed results: in certain contexts (quiz performance and video watching behavior) crowdworkers behave comparably to MOOC learners, while in other situations (interactions with in-video quizzes) their behaviors diverge. We conclude that future research should be cautious when employing crowdworkers to carry out learning tasks, as the two populations do not behave comparably on all learning-related activities.
Original language | English
---|---
Article number | 42
Pages (from-to) | 1-16
Number of pages | 16
Journal | Proceedings of the ACM on Human-Computer Interaction
Volume | 2
Issue number | CSCW
DOIs |
Publication status | Published - 2018
Bibliographical note
Green Open Access added to TU Delft Institutional Repository as part of the Taverne project 'You share, we take care!' (https://www.openaccess.nl/en/you-share-we-take-care). Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work and the author uses the Dutch legislation to make this work public.
Keywords
- Learning Analytics
- MOOCs
- Replication
- Crowdwork