Large Deviations of a Spatially-Stationary Network of Interacting Neurons

Abstract: In this work we establish a process-level Large Deviation Principle (LDP) for a model of interacting neurons indexed by the lattice Z^d. The neurons are subject to noise, modelled as a correlated martingale. The probability law governing the noise is strictly stationary, and we are therefore able to obtain an LDP for the probability laws Π_n governing the stationary empirical measure μ̂_n generated by the neurons in a cube of side length (2n + 1). We use this LDP to derive an LDP for the neural network model. The connection weights between the neurons evolve according to a learning rule (neuronal plasticity), and these results are adaptable to a large variety of neural network models. This LDP is of great use in the mathematical modelling of neural networks, because it allows a quantification of the likelihood that the system deviates from its limit, and a determination of the direction in which it is likely to deviate. The work is also of interest because there are nontrivial correlations between the neurons even in the asymptotic limit, so the model presents itself as a generalisation of traditional mean-field models.
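The empirical-measure construction mentioned in the abstract can be sketched as follows. This is the standard form of a spatially-stationary empirical measure over a cube of side length (2n + 1); the notation (V_n, the shift S^j) is illustrative and not taken from the paper itself:

```latex
% Cube of (2n+1)^d lattice sites around the origin:
%   V_n = \{-n, \dots, n\}^d \subset \mathbb{Z}^d.
% Empirical measure: average of the spatially shifted configurations
% S^j u of the process u, where S^j shifts by j \in \mathbb{Z}^d:
\hat{\mu}_n(u) \;=\; \frac{1}{(2n+1)^d} \sum_{j \in V_n} \delta_{S^j u}.
% Strict stationarity of the noise law makes the laws \Pi_n of
% \hat{\mu}_n amenable to a process-level LDP as n \to \infty.
```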
Document type: Preprint, working paper

Cited literature: [62 references]
Contributor: Olivier Faugeras
Submitted on: Thursday, December 22, 2016 - 08:35:30
Last modified on: Wednesday, January 30, 2019 - 14:28:04
Document(s) archived on: Tuesday, March 21, 2017 - 12:19:11


Files produced by the author(s)


  • HAL Id: hal-01421319, version 1



Olivier Faugeras, James Maclaurin. Large Deviations of a Spatially-Stationary Network of Interacting Neurons. 2016. 〈hal-01421319〉


