Riemannian metrics for neural networks II: Recurrent networks and learning symbolic data sequences

Yann Ollivier
TAO - Machine Learning and Optimisation; LRI - Laboratoire de Recherche en Informatique, UP11 - Université Paris-Sud - Paris 11, Inria Saclay - Île de France, CNRS - Centre National de la Recherche Scientifique : UMR8623
Abstract: Recurrent neural networks are powerful models for sequential data, able to represent complex dependencies in the sequence that simpler models such as hidden Markov models cannot handle. Yet they are notoriously hard to train. Here we introduce a training procedure using a gradient ascent in a Riemannian metric: this produces an algorithm independent of design choices such as the encoding of parameters and unit activities. This metric gradient ascent is designed to have an algorithmic cost close to backpropagation through time for sparsely connected networks. We use this procedure on \emph{gated leaky neural networks} (GLNNs), a variant of recurrent neural networks with an architecture inspired by finite automata and an evolution equation inspired by continuous-time networks. GLNNs trained with a Riemannian gradient are demonstrated to effectively capture a variety of structures in synthetic problems: basic block nesting as in context-free grammars (an important feature of natural languages, but difficult to learn), intersections of multiple independent Markov-type relations, or long-distance relationships such as the distant-XOR problem. This method does not require adjusting the network structure or initial parameters: the network used is a sparse random graph and the initialization is identical for all problems considered.
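To illustrate the key idea behind the abstract's invariance claim, the following is a minimal sketch (not the paper's GLNN algorithm): a gradient step in a Riemannian metric rescales the Euclidean gradient by the inverse of a positive-definite metric matrix, which is what makes the update independent of how the parameters are encoded. The toy loss and the choice of metric here are assumptions for illustration only.

```python
import numpy as np

def riemannian_step(theta, grad, metric, lr=0.1):
    """One gradient descent step in the geometry defined by `metric`.

    Instead of theta - lr * grad (which depends on the parameter
    encoding), we take theta - lr * metric^{-1} grad, the steepest
    descent direction with respect to the Riemannian metric.
    """
    return theta - lr * np.linalg.solve(metric, grad)

# Toy quadratic loss L(theta) = 0.5 * theta^T A theta, minimized at 0.
A = np.array([[4.0, 0.0],
              [0.0, 1.0]])
theta = np.array([1.0, 1.0])
grad = A @ theta  # Euclidean gradient of the toy loss

# Using A itself as the metric (the natural choice for this loss),
# a single step with lr=1 lands exactly on the minimizer:
theta_new = riemannian_step(theta, grad, A, lr=1.0)
```

An ill-conditioned plain gradient step with the same learning rate would overshoot along the first coordinate; the metric-preconditioned step does not, which is the practical benefit such methods aim for.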
Document type:
Journal article
Information and Inference, Oxford University Press (OUP), 2015, 4 (2), pp. 154-193. 〈http://imaiai.oxfordjournals.org/content/4/2/154〉. 〈10.1093/imaiai/iav007〉

https://hal.archives-ouvertes.fr/hal-00857980
Contributor: Yann Ollivier
Submitted on: Wednesday, 4 September 2013 - 13:32:37
Last modified: Thursday, 5 April 2018 - 12:30:12


Citation

Yann Ollivier. Riemannian metrics for neural networks II: Recurrent networks and learning symbolic data sequences. Information and Inference, Oxford University Press (OUP), 2015, 4 (2), pp.154-193. 〈http://imaiai.oxfordjournals.org/content/4/2/154〉. 〈10.1093/imaiai/iav007〉. 〈hal-00857980〉
