Mark my Word: A Sequence-to-Sequence Approach to Definition Modeling - Archive ouverte HAL
Journal article, Proceedings of the First NLPL Workshop on Deep Learning for Natural Language Processing, Year: 2019


Abstract

Defining words in a textual context is a useful task both for practical purposes and for gaining insight into distributed word representations. Building on the distributional hypothesis, we argue here that the most natural formalization of definition modeling is to treat it as a sequence-to-sequence task rather than a word-to-sequence task: given an input sequence with a highlighted word, generate a contextually appropriate definition for it. We implement this approach in a Transformer-based sequence-to-sequence model. Our proposal allows contextualization and definition generation to be trained end to end, a conceptual improvement over earlier works. We achieve state-of-the-art results on both contextual and non-contextual definition modeling.
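To make the formulation concrete, the following is a minimal sketch of the idea described in the abstract, not the authors' implementation: the word to be defined is marked with special tokens inside its context, and a Transformer encoder-decoder maps the marked sequence to a definition. The marker tokens (<define>, </define>), hyperparameters, and the toy PyTorch model are illustrative assumptions; see the paper for the actual architecture and training setup.

import torch
import torch.nn as nn


def mark_word(context_tokens, target_index, open_tok="<define>", close_tok="</define>"):
    """Wrap the word to be defined in marker tokens (an assumed convention)."""
    return (
        context_tokens[:target_index]
        + [open_tok, context_tokens[target_index], close_tok]
        + context_tokens[target_index + 1:]
    )


class Seq2SeqDefinitionModel(nn.Module):
    """Toy Transformer encoder-decoder: marked context in, definition out."""

    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True,
        )
        self.project = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # src_ids: marked context; tgt_ids: definition prefix (teacher forcing).
        src = self.embed(src_ids)
        tgt = self.embed(tgt_ids)
        causal_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        out = self.transformer(src, tgt, tgt_mask=causal_mask)
        return self.project(out)  # logits over the output vocabulary


if __name__ == "__main__":
    # Mark "bank" in its context; the ids below are dummy placeholders.
    context = ["she", "sat", "on", "the", "bank", "of", "the", "river"]
    print(mark_word(context, 4))

    model = Seq2SeqDefinitionModel(vocab_size=1000)
    src = torch.randint(0, 1000, (1, 10))  # token ids of a marked context
    tgt = torch.randint(0, 1000, (1, 6))   # token ids of a partial definition
    print(model(src, tgt).shape)           # torch.Size([1, 6, 1000])

Because the marked context and the definition are both plain token sequences, contextualization and generation can be trained jointly with a standard sequence-to-sequence objective, which is the end-to-end property the abstract highlights.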
Main file: ecp19163001.pdf (212 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02362397, version 1 (13-11-2019)

Identifiers

Cite

Timothee Mickus, Denis Paperno, Mathieu Constant. Mark my Word: A Sequence-to-Sequence Approach to Definition Modeling. Proceedings of the First NLPL Workshop on Deep Learning for Natural Language Processing, 2019, ⟨10.48550/arXiv.1911.05715⟩. ⟨hal-02362397⟩
62 views
25 downloads
