Better Together? The Role of Explanations in Supporting Novices in Individual and Collective Deliberations about AI
Abstract

Deploying AI systems in public institutions can have far-reaching consequences for many people, making such deployments a matter of public interest. Providing opportunities for stakeholders to come together, understand these systems, and debate their merits and harms is thus essential. Explainable AI (XAI) research often focuses on individual users, but deliberation benefits from group settings, which remain underexplored. To address this gap, we present findings from an interview study with 8 focus groups and 12 individual participants. Our findings provide insight into how explanations support AI novices in deliberating alone and in groups. Participants used modular explanations comprising four information categories to solve tasks and to decide about an AI system's deployment. We found that the explanations supported groups in creating a shared understanding and in finding arguments for and against the system's deployment. In comparison, individual participants engaged with the explanations in more depth and performed better in the study tasks, but missed an exchange with others. Based on our findings, we provide suggestions on how explanations should be designed to work in group settings and describe their potential use in real-world contexts. With this, our contributions inform XAI research that aims to enable AI novices to understand and deliberate about AI systems in the public sector.

Authors
  • Schmude, Timothée
  • Koesten, Laura
  • Möller, Torsten
  • Tschiatschek, Sebastian
Shortfacts
Category
Technical Report (Technical Report)
Divisions
Data Mining and Machine Learning
Visualization and Data Analysis
Subjects
Artificial Intelligence
Computer Science in Relation to Humans and Society
Volume
abs/24
Date
18 November 2024
Official URL
https://doi.org/10.48550/arXiv.2411.11449