A geometrical analysis of global stability in trained feedback networks - HAL open archive
Journal article in Neural Computation, 2019

A geometrical analysis of global stability in trained feedback networks

Abstract

Recurrent neural networks have been extensively studied in the context of neuroscience and machine learning due to their ability to implement complex computations. While substantial progress in designing effective learning algorithms has been achieved in recent years, a full understanding of trained recurrent networks is still lacking. Specifically, the mechanisms that allow computations to emerge from the underlying recurrent dynamics are largely unknown. Here we focus on a simple yet underexplored computational setup: a feedback architecture trained to associate a stationary output with a stationary input. As a starting point, we derive an approximate analytical description of the global dynamics in trained networks, which assumes uncorrelated connectivity weights in the feedback and in the random bulk. The resulting mean-field theory suggests that the task admits several classes of solutions, which imply different stability properties. The different classes are characterized in terms of the geometrical arrangement of the readout with respect to the input vectors, defined in the high-dimensional space spanned by the network population. We find that this approximate theoretical approach can be used to understand how standard training techniques implement the input-output task in finite-size feedback networks. In particular, our simplified description captures the local and the global stability properties of the target solution, and thus predicts training performance.
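The feedback architecture described in the abstract can be illustrated with a minimal simulation. The sketch below is an assumption-laden toy version, not the paper's actual model or parameters: a rate network with a random bulk connectivity matrix, a stationary input vector, and a scalar readout that is fed back into the dynamics through a feedback vector (all names and values here are illustrative).

```python
import numpy as np

# Illustrative sketch (not the paper's code): rate network with random bulk J,
# stationary input vector I, and a readout z = w . phi(x) fed back via m.
rng = np.random.default_rng(0)
N = 500                                        # network size (assumed)
g = 0.8                                        # bulk gain, below chaos onset (assumed)
J = rng.normal(0.0, g / np.sqrt(N), (N, N))    # random bulk connectivity
I = rng.normal(0.0, 1.0, N)                    # input vector
m = rng.normal(0.0, 1.0, N)                    # feedback vector
w = rng.normal(0.0, 1.0, N) / N                # readout vector

phi = np.tanh        # transfer function
u = 1.0              # stationary input amplitude
dt, T = 0.1, 2000    # Euler step and number of steps

x = np.zeros(N)
for _ in range(T):
    z = w @ phi(x)                              # scalar readout, fed back
    x += dt * (-x + J @ phi(x) + m * z + I * u)

print("stationary readout z =", w @ phi(x))
```

With the bulk gain below the chaotic transition and a weak feedback loop, the dynamics relax to a stationary state, so the readout settles to a constant value associated with the constant input; the paper's analysis concerns when and how such fixed points are locally and globally stable.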

Domains

Neurosciences
Main file

1809.02386.pdf (6.25 MB)
Origin: files produced by the author(s)

Dates and versions

hal-02413930 , version 1 (16-12-2019)

Identifiers

Cite

Francesca Mastrogiuseppe, Srdjan Ostojic. A geometrical analysis of global stability in trained feedback networks. Neural Computation, 2019, 31 (6), pp.1139-1182. ⟨10.1162/neco_a_01187⟩. ⟨hal-02413930⟩

Collections

ENS-PARIS PSL ANR
25 views
24 downloads

