Understanding the Role of Explanation Modality in AI-assisted Decision-making

In Proceedings of the 30th ACM Conference on User Modeling, Adaptation and Personalization (UMAP '22), pages 223–233. ACM, July 2022.
DOI: 10.1145/3503252.3531311

Abstract

Advances in artificial intelligence and machine learning have led to a steep rise in the adoption of AI to augment or support human decision-making across domains. A growing body of work addresses the benefits of model interpretability and explanations in helping end-users or other stakeholders decipher the inner workings of so-called "black box" AI systems. Yet, little is currently understood about the role of the modalities through which explanations can be communicated (e.g., text, visualizations, or audio) in informing, augmenting, and shaping human decision-making. In our work, we address this research gap through the lens of a credibility assessment system. Considering the deluge of information available through various channels, people constantly make decisions while considering the perceived credibility of the information they consume. However, amid increasing information overload, assessing the credibility of the information we encounter is a non-trivial task. To help users in this task, automated credibility assessment systems have been devised as decision support systems in various contexts (e.g., assessing the credibility of news or social media posts). For these systems to be effective in supporting users, however, they need to be trusted and understood. Explanations have been shown to play an essential role in informing users' reliance on decision support systems. In this paper, we investigate the influence of explanation modalities on an AI-assisted credibility assessment task. We use a between-subjects experiment (N = 375), spanning six explanation modalities, to evaluate the role of explanation modality in the accuracy of AI-assisted decision outcomes, perceived system trust among users, and system usability. Our results indicate that explanations play a significant role in shaping users' reliance on the decision support system and, thereby, the accuracy of the decisions made. Users assessed statement credibility more accurately in the presence of explanations and had a significantly harder time agreeing on statement credibility without them. Among the explanation conditions, text and audio explanations were more effective than graphical explanations. Additionally, combining graphical explanations with text and/or audio explanations was particularly effective: such combinations led to higher user performance than graphical explanations alone.
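
The abstract describes a between-subjects comparison of decision accuracy across six explanation-modality conditions. The following is a minimal, hypothetical Python sketch of how such a comparison might be run; it is not from the paper, and the condition labels, group sizes, and simulated effect sizes are assumptions inferred from the abstract rather than the authors' data or analysis.

```python
# Hypothetical sketch of a between-subjects accuracy comparison across
# six explanation-modality conditions (labels and effect sizes are
# illustrative; data are simulated, not the authors' results).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
modalities = ["text", "audio", "graphic",
              "graphic+text", "graphic+audio", "graphic+text+audio"]

# Simulate per-participant accuracy (proportion of correct credibility
# judgments), roughly 62 participants per condition (N ≈ 375).
mean_by_condition = [0.68, 0.67, 0.60, 0.70, 0.69, 0.71]  # assumed values
accuracy = {m: rng.normal(loc=mu, scale=0.1, size=62).clip(0, 1)
            for m, mu in zip(modalities, mean_by_condition)}

# One-way ANOVA: does mean accuracy differ across modality conditions?
f_stat, p_value = stats.f_oneway(*accuracy.values())
print(f"ANOVA across modalities: F = {f_stat:.2f}, p = {p_value:.4f}")

# Pairwise follow-up, e.g., graphic alone vs. graphic+text (Welch's t-test).
t_stat, p_pair = stats.ttest_ind(accuracy["graphic"],
                                 accuracy["graphic+text"],
                                 equal_var=False)
print(f"graphic vs graphic+text: t = {t_stat:.2f}, p = {p_pair:.4f}")
```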
