New Recurrent Neural Network Variants for Sequence Labeling

Abstract: In this paper we study different architectures of Recurrent Neural Networks (RNN) for sequence labeling tasks. We propose two new RNN variants and compare them to the more traditional RNN architectures of Elman and Jordan. We explain in detail the advantages of these new variants over Elman's and Jordan's RNNs. We evaluate all models, both new and traditional, on three different tasks: POS-tagging of the French Treebank, and two Spoken Language Understanding (SLU) tasks, namely ATIS and MEDIA. The results we obtain clearly show that the new RNN variants are more effective than the traditional ones.
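As background for the comparison in the abstract, the two traditional architectures differ only in what is fed back at each step: an Elman RNN recurs over its hidden state, while a Jordan RNN recurs over its previous output. The sketch below illustrates this difference with NumPy; all dimensions and parameters are illustrative, and the paper's new variants are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not taken from the paper)
D_IN, D_HID, D_OUT = 4, 8, 3

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def elman_step(x_t, h_prev, Wx, Wh, Wo, b, bo):
    """Elman RNN: the previous *hidden state* is the recurrent signal."""
    h_t = np.tanh(Wx @ x_t + Wh @ h_prev + b)
    y_t = softmax(Wo @ h_t + bo)
    return h_t, y_t

def jordan_step(x_t, y_prev, Wx, Wy, Wo, b, bo):
    """Jordan RNN: the previous *output* (label distribution) is fed back."""
    h_t = np.tanh(Wx @ x_t + Wy @ y_prev + b)
    y_t = softmax(Wo @ h_t + bo)
    return h_t, y_t

# Random parameters and a toy input sequence of length 5
Wx = rng.standard_normal((D_HID, D_IN))
Wh = rng.standard_normal((D_HID, D_HID))
Wy = rng.standard_normal((D_HID, D_OUT))
Wo = rng.standard_normal((D_OUT, D_HID))
b, bo = np.zeros(D_HID), np.zeros(D_OUT)
xs = rng.standard_normal((5, D_IN))

# Elman: carry the hidden state forward
h = np.zeros(D_HID)
elman_outputs = []
for x_t in xs:
    h, y_t = elman_step(x_t, h, Wx, Wh, Wo, b, bo)
    elman_outputs.append(y_t)

# Jordan: carry the previous output distribution forward
y_prev = np.full(D_OUT, 1.0 / D_OUT)
jordan_outputs = []
for x_t in xs:
    _, y_prev = jordan_step(x_t, y_prev, Wx, Wy, Wo, b, bo)
    jordan_outputs.append(y_prev)
```

For sequence labeling, each `y_t` is a distribution over labels for the token at position `t`; the Jordan feedback lets the previous label prediction directly influence the next one, which is one motivation the paper builds on.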
Document type: Conference papers
Cited literature: [26 references]

https://hal.archives-ouvertes.fr/hal-01489955
Contributor: Marco Dinarelli
Submitted on: Tuesday, March 14, 2017 - 5:02:46 PM
Last modification on: Tuesday, January 5, 2021 - 5:28:07 PM
Long-term archiving on: Thursday, June 15, 2017 - 3:11:32 PM

File

2016Cicling_NewRNN_author-fina...
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01489955, version 1

Citation

Marco Dinarelli, Isabelle Tellier. New Recurrent Neural Network Variants for Sequence Labeling. 17th International Conference on Intelligent Text Processing and Computational Linguistics, Apr 2016, Konya, Turkey. ⟨hal-01489955⟩

Metrics

Record views: 330
File downloads: 787