
A Study of the Plausibility of Attention between RNN Encoders in Natural Language Inference

Duc Hau Nguyen¹, Guillaume Gravier¹, Pascale Sébillot¹
¹ LinkMedia - Creating and exploiting explicit links between multimedia fragments, Inria Rennes – Bretagne Atlantique, IRISA-D6 - MEDIA ET INTERACTIONS
Abstract: Attention maps in neural models for NLP are appealing for explaining a model's decision, as they can emphasize the words that justify it. While many empirical studies, often based on the analysis of selected examples, suggest that attention maps can provide such justification, only a few assess the plausibility of explanations based on attention maps, i.e., how useful attention maps are for humans to understand the decision. These studies furthermore focus on text classification. In this paper, we report on a preliminary assessment of attention maps in a sentence comparison task, namely natural language inference. We compare the cross-attention weights between two RNN encoders with human-based and heuristic-based annotations on the eSNLI corpus. We show that the heuristic reasonably correlates with human annotations and can thus facilitate the evaluation of plausible explanations in sentence comparison tasks. Raw attention weights, however, remain only loosely related to a plausible explanation.
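The cross-attention weights studied in the paper can be illustrated with a minimal sketch: given the hidden states produced by two RNN encoders (one per sentence), a row-wise softmax over pairwise dot-product scores yields an attention map from premise tokens to hypothesis tokens. This is only an illustrative assumption of the general mechanism, not the authors' implementation; the function name and dot-product scoring are choices made here for clarity.

```python
import numpy as np

def cross_attention(premise_states, hypothesis_states):
    """Row-wise softmax attention of premise tokens over hypothesis tokens.

    premise_states:    (m, d) array of RNN hidden states for the premise
    hypothesis_states: (n, d) array of RNN hidden states for the hypothesis
    Returns an (m, n) attention map whose rows each sum to 1; row i shows
    how much premise token i attends to each hypothesis token.
    (Illustrative sketch only; the paper's exact scoring function may differ.)
    """
    scores = premise_states @ hypothesis_states.T      # (m, n) dot-product scores
    scores -= scores.max(axis=1, keepdims=True)        # stabilize the softmax
    weights = np.exp(scores)
    return weights / weights.sum(axis=1, keepdims=True)

# Toy usage: random "hidden states" for a 4-token premise and 6-token hypothesis.
rng = np.random.default_rng(0)
attention_map = cross_attention(rng.normal(size=(4, 8)), rng.normal(size=(6, 8)))
```

Each row of such a map is what plausibility studies compare against human token-level annotations (here, the eSNLI highlights).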
Contributor: Guillaume Gravier
Submitted on : Monday, October 11, 2021 - 8:56:33 AM
Last modification on : Friday, August 5, 2022 - 2:54:52 PM
Long-term archiving on: Wednesday, January 12, 2022 - 7:00:08 PM
Files produced by the author(s)

HAL Id: hal-03372669, version 1
Duc Hau Nguyen, Guillaume Gravier, Pascale Sébillot. A Study of the Plausibility of Attention between RNN Encoders in Natural Language Inference. ICMLA 2021 - 20th IEEE International Conference on Machine Learning and Applications, Dec 2021, Pasadena, United States. pp.1-7. ⟨hal-03372669⟩


