New Recurrent Neural Network Variants for Sequence Labeling - Archive ouverte HAL
Conference paper, Lecture Notes in Computer Science, 2016

New Recurrent Neural Network Variants for Sequence Labeling

Abstract

In this paper we study different architectures of Recurrent Neural Networks (RNN) for sequence labeling tasks. We propose two new variants of RNN and compare them to the traditional architectures of Elman and Jordan. We explain in detail the advantages of these new variants over Elman and Jordan RNNs. We evaluate all models, both new and traditional, on three different tasks: POS tagging of the French Treebank, and two Spoken Language Understanding (SLU) tasks, namely ATIS and MEDIA. The results we obtain clearly show that the new variants of RNN are more effective than the traditional ones.
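As context for the abstract, the sketch below contrasts one forward step of the two traditional architectures it mentions: an Elman RNN, which feeds the previous hidden state back into the hidden layer, and a Jordan RNN, which feeds back the previous output (label distribution) instead. This is a minimal illustrative sketch with made-up dimensions and weight names; it does not show the authors' new variants, which are described in the paper itself.

```python
# Minimal NumPy sketch of the traditional Elman and Jordan recurrences.
# All dimensions and weight names are illustrative, not taken from the paper.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
d_in, d_hid, n_labels = 50, 100, 10            # illustrative sizes

W_xh = rng.normal(scale=0.1, size=(d_hid, d_in))
W_hh = rng.normal(scale=0.1, size=(d_hid, d_hid))     # Elman: hidden-to-hidden feedback
W_yh = rng.normal(scale=0.1, size=(d_hid, n_labels))  # Jordan: output-to-hidden feedback
W_hy = rng.normal(scale=0.1, size=(n_labels, d_hid))

def elman_step(x_t, h_prev):
    # Elman RNN: the previous hidden state is fed back into the hidden layer.
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev)
    y_t = softmax(W_hy @ h_t)
    return h_t, y_t

def jordan_step(x_t, y_prev):
    # Jordan RNN: the previous output (label distribution) is fed back instead.
    h_t = np.tanh(W_xh @ x_t + W_yh @ y_prev)
    y_t = softmax(W_hy @ h_t)
    return h_t, y_t

# Usage example: one step from an all-zero hidden state / uniform label prior.
x_t = rng.normal(size=d_in)
h1, y1 = elman_step(x_t, np.zeros(d_hid))
h1j, y1j = jordan_step(x_t, np.full(n_labels, 1.0 / n_labels))
```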
Main file
2016Cicling_NewRNN_author-final.pdf (310.33 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01489955, version 1 (14-03-2017)

Identifiers

  • HAL Id: hal-01489955, version 1

Cite

Marco Dinarelli, Isabelle Tellier. New Recurrent Neural Network Variants for Sequence Labeling. 17th International Conference on Intelligent Text Processing and Computational Linguistics, Apr 2016, Konya, Turkey. ⟨hal-01489955⟩
238 views
563 downloads
