Conference papers

Label-Dependencies Aware Recurrent Neural Networks

Abstract: In the last few years, Recurrent Neural Networks (RNNs) have proved effective on several NLP tasks. Despite this success, their ability to model sequence labeling is still limited. This has led research toward solutions where RNNs are combined with models that have already proved effective in this domain, such as CRFs. In this work we propose a far simpler but very effective solution: an evolution of the simple Jordan RNN, where labels are re-injected as input into the network and converted into embeddings, in the same way as words. We compare this RNN variant to the other RNN models (Elman and Jordan RNNs, LSTM and GRU) on two well-known Spoken Language Understanding (SLU) tasks. Thanks to label embeddings and their combination at the hidden layer, the proposed variant, which uses more parameters than Elman and Jordan RNNs but far fewer than LSTM and GRU, is not only more effective than the other RNNs but also outperforms sophisticated CRF models.
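
The label re-injection mechanism summarized in the abstract can be illustrated with a short sketch. The following is not the authors' implementation; it is a minimal illustration under assumed choices (a PyTorch setting, made-up dimensions, a dummy start label, and greedy feedback of the predicted label), showing how the previous label is embedded like a word and combined with the word embedding at the hidden layer.

    import torch
    import torch.nn as nn

    class LabelAwareJordanRNN(nn.Module):
        # Sketch of a Jordan-style RNN where the previously predicted label,
        # embedded like a word, is fed back as input at the next position.
        # All names and dimensions are illustrative assumptions.
        def __init__(self, vocab_size, num_labels, word_dim=100, label_dim=30, hidden_dim=128):
            super().__init__()
            self.word_emb = nn.Embedding(vocab_size, word_dim)
            self.label_emb = nn.Embedding(num_labels, label_dim)  # labels get embeddings, like words
            self.hidden = nn.Linear(word_dim + label_dim, hidden_dim)
            self.out = nn.Linear(hidden_dim, num_labels)
            self.start_label = 0  # hypothetical dummy "start" label for the first position

        def forward(self, word_ids):
            # word_ids: 1-D LongTensor of word indices for one sentence
            prev_label = torch.tensor(self.start_label)
            logits_per_step = []
            for t in range(word_ids.size(0)):
                w = self.word_emb(word_ids[t])
                l = self.label_emb(prev_label)
                # combine word and label embeddings at the hidden layer
                h = torch.tanh(self.hidden(torch.cat([w, l], dim=-1)))
                logits = self.out(h)
                logits_per_step.append(logits)
                # re-inject the predicted label as input for the next step
                prev_label = logits.argmax(dim=-1)
            return torch.stack(logits_per_step)

At training time one could instead feed the previous gold label (teacher forcing) or mix gold and predicted labels; the abstract does not specify this detail, so the greedy feedback above is only one plausible reading.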

Cited literature: 47 references

https://hal.archives-ouvertes.fr/hal-01579071
Contributor: Marco Dinarelli
Submitted on: Wednesday, August 30, 2017 - 12:34:03 PM
Last modification on: Thursday, September 5, 2019 - 11:50:02 AM

File

2017Cicling_NewRNN_forarXiv.pd...
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01579071, version 1

Citation

Yoann Dupont, Marco Dinarelli, Isabelle Tellier. Label-Dependencies Aware Recurrent Neural Networks. Intelligent Text Processing and Computational Linguistics (CICling), Apr 2017, Budapest, Hungary. ⟨hal-01579071⟩

Metrics

Record views: 157
File downloads: 125