
Augmenting Transformers with KNN-Based Composite Memory for Dialog

Abstract : Various machine learning tasks can benefit from access to external information of different modalities, such as text and images. Recent work has focused on learning architectures with large memories capable of storing this knowledge. We propose augmenting generative Transformer neural networks with KNN-based Information Fetching (KIF) modules. Each KIF module learns a read operation to access fixed external knowledge. We apply these modules to generative dialog modeling, a challenging task where information must be flexibly retrieved and incorporated to maintain the topic and flow of conversation. We demonstrate the effectiveness of our approach by identifying relevant knowledge required for knowledgeable but engaging dialog from Wikipedia, images, and human-written dialog utterances, and show that leveraging this retrieved information improves model performance, measured by automatic and human evaluation.
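For intuition, the following is a minimal, hypothetical sketch of the kind of KNN-based read operation the abstract describes: a learned query projection scores a fixed set of precomputed knowledge embeddings, the top-k nearest items are fetched, and their weighted sum forms a memory readout that can be fused with the Transformer decoder state. All module names, dimensions, and the fusion step are illustrative assumptions (written here in PyTorch), not the authors' implementation.

```python
# Minimal sketch of a KNN-based read over a fixed external memory, loosely
# following the idea described in the abstract. Names, sizes, and the
# weighting scheme are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class KNNReadModule(nn.Module):
    def __init__(self, hidden_dim: int, memory: torch.Tensor, k: int = 5):
        super().__init__()
        # `memory` holds precomputed embeddings of fixed external knowledge
        # (e.g. encoded Wikipedia sentences), shape (num_items, hidden_dim).
        self.register_buffer("memory", memory)
        self.k = k
        # Learned projection mapping the decoder state into the memory space.
        self.query_proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, decoder_state: torch.Tensor) -> torch.Tensor:
        # decoder_state: (batch, hidden_dim)
        query = self.query_proj(decoder_state)                 # (batch, hidden_dim)
        scores = query @ self.memory.t()                       # (batch, num_items)
        top_scores, top_idx = scores.topk(self.k, dim=-1)      # (batch, k)
        neighbors = self.memory[top_idx]                       # (batch, k, hidden_dim)
        weights = F.softmax(top_scores, dim=-1).unsqueeze(-1)  # (batch, k, 1)
        readout = (weights * neighbors).sum(dim=1)             # (batch, hidden_dim)
        # The readout could then be fused with the decoder state,
        # e.g. by concatenation or a learned gate, before generation.
        return readout

# Usage: retrieve from 1000 dummy knowledge embeddings for a batch of 2 states.
memory = torch.randn(1000, 512)
kif = KNNReadModule(hidden_dim=512, memory=memory, k=5)
print(kif(torch.randn(2, 512)).shape)  # torch.Size([2, 512])
```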
Document type :
Journal articles

Cited literature : 46 references

https://hal.archives-ouvertes.fr/hal-02999678
Contributor : Angela Fan
Submitted on : Wednesday, November 11, 2020 - 12:14:08 AM
Last modification on : Wednesday, June 9, 2021 - 10:00:31 AM
Long-term archiving on : Friday, February 12, 2021 - 6:03:20 PM

File

2419-Fan-finalversion.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-02999678, version 1
  • ARXIV : 2004.12744

Citation

Angela Fan, Claire Gardent, Chloé Braud, Antoine Bordes. Augmenting Transformers with KNN-Based Composite Memory for Dialog. Transactions of the Association for Computational Linguistics, The MIT Press, in press. ⟨hal-02999678⟩

Metrics

Record views : 156
File downloads : 94