
How Good is your Explanation? Algorithmic Stability Measures to Assess the Quality of Explanations for Deep Neural Networks

Abstract: A plethora of methods have been proposed to explain how deep neural networks reach their decisions, but comparatively little effort has been made to ensure that the explanations produced by these methods are objectively relevant. While several desirable properties of trustworthy explanations have been formulated, objective measures have been harder to derive. Here, we propose two new measures for evaluating explanations, borrowed from the field of algorithmic stability: mean generalizability (MeGe) and relative consistency (ReCo). We conduct extensive experiments on different network architectures, common explainability methods, and several image datasets to demonstrate the benefits of the proposed measures. In comparison to ours, popular fidelity measures are not sufficient to guarantee trustworthy explanations. Finally, we found that 1-Lipschitz networks produce explanations with higher MeGe and ReCo than common neural networks while reaching similar accuracy, suggesting that 1-Lipschitz networks are a promising direction towards predictors that are more explainable and trustworthy.
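The core idea of an algorithmic-stability view of explanation quality can be illustrated with a toy experiment: retrain the same model class on different data splits and check whether the explanations for a fixed test point stay consistent. The sketch below is not the paper's MeGe or ReCo definitions; it is a minimal stand-in using a plain-numpy logistic regression, a gradient-times-input saliency, and cosine similarity as a hypothetical consistency score.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs in 5 dimensions, linearly separable.
n, d = 400, 5
X = np.vstack([rng.normal(-1, 1, (n // 2, d)), rng.normal(1, 1, (n // 2, d))])
y = np.array([0] * (n // 2) + [1] * (n // 2))
perm = rng.permutation(n)
X, y = X[perm], y[perm]

def train_logreg(X, y, lr=0.1, steps=500):
    # Plain-numpy logistic regression fitted by gradient descent.
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def saliency(w, x):
    # Simple gradient-times-input attribution for a linear model.
    return np.abs(w * x)

# Retrain on two disjoint halves, then explain the same test point with each.
x_test = X[-1]
w1 = train_logreg(X[: n // 2], y[: n // 2])
w2 = train_logreg(X[n // 2 : -1], y[n // 2 : -1])
e1, e2 = saliency(w1, x_test), saliency(w2, x_test)

# Stability-style consistency score: cosine similarity of the two explanations.
# A score near 1 means the explanation generalizes across retrainings.
cos = e1 @ e2 / (np.linalg.norm(e1) * np.linalg.norm(e2) + 1e-12)
print(f"explanation similarity across retrainings: {cos:.3f}")
```

The paper's actual measures aggregate this kind of cross-split comparison over many folds and test points; the sketch only shows the single-point version of the intuition.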
Document type: Conference papers

https://hal.archives-ouvertes.fr/hal-02930949
Contributor: Thomas Fel
Submitted on: Monday, November 8, 2021 - 6:37:06 PM
Last modification on: Wednesday, November 17, 2021 - 12:30:02 AM

Files

main.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-02930949, version 3
  • arXiv: 2009.04521

Citation

Thomas Fel, David Vigouroux, Rémi Cadène, Thomas Serre. How Good is your Explanation? Algorithmic Stability Measures to Assess the Quality of Explanations for Deep Neural Networks. 2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Jan 2022, Hawaii, United States. ⟨hal-02930949v3⟩
