Understanding the Role of Explanation Modality in AI-assisted Decision-making

Vincent Robbemond, Oana Inel, Ujwal Gadiraju

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

1 Citation (Scopus)
50 Downloads (Pure)

Abstract

Advances in artificial intelligence and machine learning have led to a steep rise in the adoption of AI to augment or support human decision-making across domains. A growing body of work addresses the benefits of model interpretability and explanations in helping end-users and other stakeholders decipher the inner workings of so-called "black box" AI systems. Yet, little is currently understood about the role of the modalities through which explanations can be communicated (e.g., text, visualizations, or audio) to inform, augment, and shape human decision-making. In our work, we address this research gap through the lens of a credibility assessment system. Considering the deluge of information available through various channels, people constantly make decisions based on the perceived credibility of the information they consume. However, amid increasing information overload, assessing the credibility of the information we encounter is a non-trivial task. To help users with this task, automated credibility assessment systems have been devised as decision support systems in various contexts (e.g., assessing the credibility of news or social media posts). For these systems to be effective in supporting users, however, they need to be trusted and understood. Explanations have been shown to play an essential role in informing users' reliance on decision support systems. In this paper, we investigate the influence of explanation modalities on an AI-assisted credibility assessment task. We use a between-subjects experiment (N = 375), spanning six different explanation modalities, to evaluate the role of explanation modality on the accuracy of AI-assisted decision outcomes, users' perceived trust in the system, and system usability. Our results indicate that explanations play a significant role in shaping users' reliance on the decision support system and, thereby, the accuracy of the decisions made. Users assessed the credibility of statements more accurately in the presence of explanations, and had a significantly harder time agreeing on statement credibility without explanations. When explanations were present, text and audio explanations were more effective than graphical explanations. Additionally, combining graphical explanations with text and/or audio explanations was significantly more effective: such combinations of modalities led to higher user performance than graphical explanations alone.
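The abstract reports accuracy comparisons across six between-subjects conditions. As a minimal illustrative sketch, not the authors' actual analysis, the following Python snippet shows how such per-condition accuracy scores could be compared with a one-way ANOVA followed by Bonferroni-corrected pairwise t-tests; the condition names, group sizes, and simulated data are all hypothetical assumptions.

    # Illustrative sketch (not the authors' code): comparing decision
    # accuracy across six hypothetical explanation-modality conditions
    # in a between-subjects design (N = 375 in total).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical per-participant accuracy scores (proportion correct),
    # one array per between-subjects condition.
    conditions = {
        "no_explanation": rng.normal(0.62, 0.10, 62),
        "text":           rng.normal(0.70, 0.10, 63),
        "audio":          rng.normal(0.69, 0.10, 62),
        "graphic":        rng.normal(0.65, 0.10, 63),
        "graphic_text":   rng.normal(0.71, 0.10, 62),
        "graphic_audio":  rng.normal(0.71, 0.10, 63),
    }

    # Omnibus test: does explanation modality affect accuracy at all?
    f_stat, p_value = stats.f_oneway(*conditions.values())
    print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

    # Pairwise follow-up tests with a Bonferroni correction.
    names = list(conditions)
    n_pairs = len(names) * (len(names) - 1) // 2
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            t_stat, p = stats.ttest_ind(conditions[names[i]], conditions[names[j]])
            print(f"{names[i]} vs {names[j]}: corrected p = {min(p * n_pairs, 1.0):.4f}")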

Original language: English
Title of host publication: UMAP2022 - Proceedings of the 30th ACM Conference on User Modeling, Adaptation and Personalization
Publisher: Association for Computing Machinery (ACM)
Pages: 223-233
Number of pages: 11
ISBN (Electronic): 978-1-4503-9207-5
DOIs
Publication status: Published - 2022
Event: 30th ACM Conference on User Modeling, Adaptation and Personalization, UMAP2022 - Virtual, Online, Spain
Duration: 4 Jul 2022 - 7 Jul 2022

Publication series

Name: UMAP2022 - Proceedings of the 30th ACM Conference on User Modeling, Adaptation and Personalization

Conference

Conference: 30th ACM Conference on User Modeling, Adaptation and Personalization, UMAP2022
Country/Territory: Spain
City: Virtual, Online
Period: 4/07/22 - 7/07/22
