Tešić, Marko and Hahn, Ulrike (2022) Can counterfactual explanations of AI systems’ predictions skew lay users’ causal intuitions about the world? If so, can we correct for that? Patterns 3 (12), p. 100635. ISSN 2666-3899.
Full text: 1-s2.0-S2666389922002677-main.pdf (Published Version of Record, available under a Creative Commons Attribution License)
Abstract
Counterfactual (CF) explanations have been employed as one of the modes of explainability in explainable artificial intelligence (AI), both to increase the transparency of AI systems and to provide recourse. Cognitive science and psychology have pointed out that people regularly use CFs to express causal relationships. Most AI systems, however, are only able to capture associations or correlations in data, so interpreting them as causal would not be justified. In this perspective, we present two experiments (total n = 364) exploring the effects of CF explanations of AI systems' predictions on lay people's causal beliefs about the real world. In Experiment 1, we found that providing CF explanations of an AI system's predictions does indeed (unjustifiably) affect people's causal beliefs regarding the factors/features the AI uses: people become more likely to view those factors as causal in the real world. Inspired by the literature on misinformation and health warning messaging, Experiment 2 tested whether we can correct for this unjustified change in causal beliefs. We found that pointing out that AI systems capture correlations and not necessarily causal relationships can attenuate the effects of CF explanations on people's causal beliefs.
Metadata
| Item Type | Article |
|---|---|
| School | Birkbeck Faculties and Schools > Faculty of Science > School of Psychological Sciences |
| Research Centres and Institutes | Centre for Cognition, Computation and Modelling |
| Depositing User | Marko Tesic |
| Date Deposited | 15 Dec 2022 06:48 |
| Last Modified | 02 Aug 2023 18:19 |
| URI | https://eprints.bbk.ac.uk/id/eprint/50231 |