
Exploring the use of Attention-Based Recurrent Neural Networks For Spoken Language Understanding

Abstract : This study explores the use of a bidirectional recurrent neural network (RNN) encoder/decoder with an attention mechanism for a Spoken Language Understanding (SLU) task. First experiments carried out on the ATIS corpus confirm the quality of the RNN baseline system used in this paper, by comparing its results to those recently published in the literature. Additional experiments show that the attention-based RNN outperforms RNN architectures recently proposed for slot filling. On the French MEDIA corpus, a state-of-the-art French SLU corpus dedicated to hotel reservation and tourist information, experiments show that a bidirectional RNN reaches an F-measure of 79.51, while adding the attention mechanism allows us to reach an F-measure of 80.27.
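To make the core idea concrete, below is a minimal sketch of an additive attention step of the kind the abstract refers to: each encoder hidden state is scored against the current decoder state, the scores are softmax-normalized, and the weighted sum becomes the context vector fed to the decoder. This is an illustration only, not the authors' implementation; the function and parameter names (`attention_context`, `W`, `U`, `v`) are assumptions for this sketch.

```python
import numpy as np

def attention_context(encoder_states, decoder_state, W, U, v):
    """Additive attention sketch (illustrative, not the paper's code).

    encoder_states: (T, d) hidden states from the bidirectional encoder
    decoder_state:  (d,)   current decoder hidden state
    W, U:           (a, d) projection matrices; v: (a,) scoring vector
    Returns the (d,) context vector and the (T,) attention weights.
    """
    # score_t = v . tanh(W h_t + U s): one scalar per encoder time step
    scores = np.tanh(encoder_states @ W.T + decoder_state @ U.T) @ v
    # softmax over time steps (shifted by the max for numerical stability)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # context vector = attention-weighted sum of encoder states
    context = weights @ encoder_states
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3, attention size 2
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 3))
s = rng.standard_normal(3)
W = rng.standard_normal((2, 3))
U = rng.standard_normal((2, 3))
v = rng.standard_normal(2)
context, weights = attention_context(H, s, W, U, v)
```

In a slot-filling decoder, this context vector would be recomputed at every output step, letting the model focus on the input words most relevant to the slot label being emitted.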
Document type :
Conference papers

https://hal.archives-ouvertes.fr/hal-01433202
Contributor : Sylvain Meignier
Submitted on : Monday, November 9, 2020 - 8:14:56 PM
Last modification on : Wednesday, November 25, 2020 - 5:50:05 PM
Long-term archiving on : Wednesday, February 10, 2021 - 7:59:26 PM

File

nips15.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01433202, version 1


Citation

Edwin Simonnet, Nathalie Camelin, Paul Deléglise, Yannick Estève. Exploring the use of Attention-Based Recurrent Neural Networks For Spoken Language Understanding. Machine Learning for Spoken Language Understanding and Interaction NIPS 2015 workshop (SLUNIPS 2015), 2015, Montreal, Canada. ⟨hal-01433202⟩


Metrics

Record views : 483
File downloads : 38