Conference paper, Year: 2021

Skim-Attention: Learning to Focus via Document Layout

Laura Nguyen
Thomas Scialom
Jacopo Staiano

Abstract

Transformer-based pre-training techniques for text and layout have proven effective in a number of document understanding tasks. Despite this success, multimodal pre-training models suffer from very high computational and memory costs. Motivated by human reading strategies, this paper presents Skim-Attention, a new attention mechanism that takes advantage of the structure of the document and its layout. Skim-Attention only attends to the 2-dimensional position of the words in a document. Our experiments show that Skim-Attention obtains a lower perplexity than prior works, while being more computationally efficient. Skim-Attention can be further combined with long-range Transformers to efficiently process long documents. We also show how Skim-Attention can be used off-the-shelf as a mask for any Pre-trained Language Model, improving performance while restricting attention. Finally, we show the emergence of a document structure representation in Skim-Attention.
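The abstract describes attention computed purely from the 2-dimensional positions of words, with the resulting distribution optionally reused as an attention mask for a separate language model. As a rough, non-authoritative illustration of that idea (not the authors' implementation), the minimal PyTorch sketch below embeds word bounding boxes and derives attention weights from layout alone; all class and parameter names, the coordinate-embedding scheme, and the dimensions are assumptions.

```python
import torch
import torch.nn as nn

class SkimAttentionSketch(nn.Module):
    """Illustrative sketch: attention scores computed solely from 2-D word
    positions (layout), then used to aggregate text representations.
    Names, dimensions, and the embedding scheme are assumptions, not the
    authors' implementation."""

    def __init__(self, hidden_size=768, num_heads=12, max_coord=1000):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        # Embed the (x0, y0, x1, y1) bounding-box coordinates of each word.
        self.x_embed = nn.Embedding(max_coord + 1, hidden_size)
        self.y_embed = nn.Embedding(max_coord + 1, hidden_size)
        self.query = nn.Linear(hidden_size, hidden_size)
        self.key = nn.Linear(hidden_size, hidden_size)

    def forward(self, bboxes, text_states):
        # bboxes: (batch, seq_len, 4) integer page coordinates
        # text_states: (batch, seq_len, hidden) token representations
        layout = (self.x_embed(bboxes[..., 0]) + self.y_embed(bboxes[..., 1])
                  + self.x_embed(bboxes[..., 2]) + self.y_embed(bboxes[..., 3]))
        b, n, _ = layout.shape
        q = self.query(layout).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.key(layout).view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        # Attention depends only on layout, not on the text itself.
        scores = torch.matmul(q, k.transpose(-2, -1)) / self.head_dim ** 0.5
        attn = scores.softmax(dim=-1)
        # The layout-only distribution weights the text states; it could also
        # be thresholded and reused as a mask for a pre-trained language model.
        v = text_states.view(b, n, self.num_heads, self.head_dim).transpose(1, 2)
        out = torch.matmul(attn, v).transpose(1, 2).reshape(b, n, -1)
        return out, attn

# Example usage (shapes only): 4 words with integer coordinates in [0, 1000].
bboxes = torch.randint(0, 1000, (1, 4, 4))
text_states = torch.randn(1, 4, 768)
out, attn = SkimAttentionSketch()(bboxes, text_states)
```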
Main file: [HAL]skim-attention.pdf (1.85 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03333889, version 1 (03-09-2021)

Identifiers

HAL Id: hal-03333889

Cite

Laura Nguyen, Thomas Scialom, Jacopo Staiano, Benjamin Piwowarski. Skim-Attention: Learning to Focus via Document Layout. Findings of the 2021 Conference on Empirical Methods in Natural Language Processing (Findings of EMNLP 2021), Nov 2021, Punta Cana, Dominican Republic. ⟨hal-03333889⟩
154 views
305 downloads
