On the Effects of Automatically Generated Adjunct Questions for Search as Learning

Peide Zhu, Arthur Câmara, Nirmal Roy, David Maxwell, Claudia Hauff

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review


Abstract

Actively engaging learners with learning materials has been shown to be very important in the Search as Learning (SAL) setting. One active reading strategy relies on asking so-called adjunct questions, i.e., manually curated questions geared towards essential concepts of the target material. However, manual question creation is impractical given the vast amount of online content. Recent research has explored the effects of Automatic Question Generation (AQG) on aiding human learning. These studies have primarily focused on user studies in controlled online reading scenarios with a limited set of documents. The impact of adjunct questions on learning in the SAL setting, which involves learning through web search, is not yet well understood. This paper addresses this gap with a user study in which automatically generated adjunct questions are integrated into a reading interface built on top of a search system. We conducted a between-subjects user study (N = 144) to investigate the effect of incorporating automatically generated adjunct questions on participants' learning. We employed three different question generation strategies as well as a control condition: (i) synthesis questions; (ii) factoid questions targeting random text spans; and (iii) factoid questions targeting terms and phrases relevant to the information need at hand. We present four major findings: (i) participants who received adjunct questions exhibited significantly more fine-grained reading behaviour, such as longer document dwell times and more scrolling, than those without adjunct questions; however, the influence of adjunct questions on learning outcomes depends on the AQG strategy. (ii) Question types significantly influence participants' reading behaviour. (iii) The target spans of the adjunct questions significantly influence learning outcomes. Lastly, (iv) participants' prior knowledge levels affect the impact of adjunct questions on their learning outcomes and their reactions to the different AQG strategies. Our findings have significant design implications for learning-oriented search systems. The data and code are available at https://github.com/zpeide/AQG-AdjunctQuestions.

Original language: English
Title of host publication: CHIIR 2024 - Proceedings of the 2024 Conference on Human Information Interaction and Retrieval
Publisher: Association for Computing Machinery (ACM)
Pages: 266-277
Number of pages: 12
ISBN (Electronic): 9798400704345
DOIs
Publication status: Published - 2024
Event: 2024 Conference on Human Information Interaction and Retrieval, CHIIR 2024 - Sheffield, United Kingdom
Duration: 10 Mar 2024 - 14 Mar 2024

Publication series

Name: CHIIR 2024 - Proceedings of the 2024 Conference on Human Information Interaction and Retrieval

Conference

Conference: 2024 Conference on Human Information Interaction and Retrieval, CHIIR 2024
Country/Territory: United Kingdom
City: Sheffield
Period: 10/03/24 - 14/03/24

