
Exploring the use of Attention-Based Recurrent Neural Networks For Spoken Language Understanding

Abstract : This study explores the use of a bidirectional recurrent neural network (RNN) encoder/decoder with an attention mechanism for a Spoken Language Understanding (SLU) task. First experiments carried out on the ATIS corpus confirm the quality of the RNN baseline system used in this paper, by comparing its results on ATIS to results recently published in the literature. Additional experiments show that an attention-based RNN outperforms RNN architectures recently proposed for the slot filling task. On the French MEDIA corpus, a state-of-the-art French SLU corpus dedicated to hotel reservation and tourist information, experiments show that a bidirectional RNN reaches an F-measure of 79.51, while adding an attention mechanism raises the F-measure to 80.27.
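The core idea described in the abstract is that, at each decoding step, the decoder attends over all bidirectional encoder states instead of relying on a single fixed vector. The following is a minimal NumPy sketch of that dot-product attention step; the function name, the scoring scheme, and the shapes are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def attention_context(encoder_states, decoder_state):
    """Illustrative dot-product attention over encoder states.

    encoder_states: (T, H) array of (e.g. bidirectional) encoder outputs,
                    one row per input time step.
    decoder_state:  (H,) current decoder hidden state.
    Returns (context, weights): the (H,) weighted sum of encoder states
    and the (T,) softmax attention weights.
    """
    # Alignment score of the decoder state against each encoder state.
    scores = encoder_states @ decoder_state          # shape (T,)
    # Softmax over time steps (shifted by the max for numerical stability).
    scores = scores - scores.max()
    weights = np.exp(scores) / np.exp(scores).sum()  # shape (T,), sums to 1
    # Context vector: attention-weighted sum of encoder states.
    context = weights @ encoder_states               # shape (H,)
    return context, weights

# Toy usage: 5 input time steps, hidden size 8.
rng = np.random.default_rng(0)
enc = rng.standard_normal((5, 8))
dec = rng.standard_normal(8)
ctx, w = attention_context(enc, dec)
```

In a slot-filling decoder, `ctx` would be concatenated with the decoder state before predicting the semantic label for the current word, letting the model focus on the most relevant input positions.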
Document type :
Conference papers
Cited literature: 13 references
Contributor: Sylvain Meignier
Submitted on: Monday, November 9, 2020 - 8:14:56 PM
Last modification on: Wednesday, November 25, 2020 - 5:50:05 PM
Long-term archiving on: Wednesday, February 10, 2021 - 7:59:26 PM


Files produced by the author(s)


  • HAL Id: hal-01433202, version 1



Edwin Simonnet, Nathalie Camelin, Paul Deléglise, Yannick Estève. Exploring the use of Attention-Based Recurrent Neural Networks For Spoken Language Understanding. Machine Learning for Spoken Language Understanding and Interaction NIPS 2015 workshop (SLUNIPS 2015), 2015, Montreal, Canada. ⟨hal-01433202⟩


