Increase in Complexity in Random Neural Networks

B. Cessac
Abstract: We study the dynamics of a discrete-time, continuous-state neural network with random asymmetric couplings and random thresholds. In the thermodynamic limit, the evolution of the neurons is given by a set of dynamic mean-field equations obtained under a local chaos hypothesis. We study the evolution of the mean quadratic distance between two trajectories and show that there exist two different regimes depending on the values of the control parameters. In the first (static regime), two initially close trajectories evolve to the same fixed point, while in the second (chaotic regime) they diverge at an exponential rate and evolve to a constant, nonzero distance. The critical condition for the transition is obtained in a general framework, but in a specific case we recover the equation of the de Almeida-Thouless line, suggesting a strong analogy with the SK model. Moreover, the limiting quadratic distance is the same for any choice of initial conditions, showing that ultrametricity occurs in our model. However, we show numerically that this property is not associated with a complex breaking up of the phase space as in the SK model. Furthermore, the quenched stochastic process governing the evolution of the neurons is a white noise in the thermodynamic limit. The behaviour of our model when crossing the AT line can be characterized by the Kolmogorov-Sinai entropy, which exhibits a sharp transition in the thermodynamic limit: this entropy is zero in the static phase, while it becomes infinite in the chaotic regime.
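The two regimes described in the abstract can be observed numerically. The following is a minimal sketch, not the paper's exact setup: it assumes a tanh transfer function, Gaussian couplings J_ij of variance g²/N, and Gaussian thresholds, then tracks the mean quadratic distance between two initially close trajectories. For a large coupling gain g the distance grows exponentially and saturates at a nonzero value (chaotic regime); for small g the two trajectories merge (static regime).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500     # network size (finite-size stand-in for the thermodynamic limit)
g = 3.0     # coupling gain; large g puts this sketch in the chaotic regime

# Random asymmetric couplings and random thresholds (assumed Gaussian here)
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
theta = rng.normal(0.0, 0.5, size=N)

def step(x):
    """One discrete-time update with a sigmoid transfer function (an assumption)."""
    return np.tanh(J @ x - theta)

x = rng.uniform(-1.0, 1.0, N)
y = x + 1e-8 * rng.normal(size=N)   # second trajectory, initially close to the first

dists = []
for t in range(200):
    x, y = step(x), step(y)
    dists.append(np.mean((x - y) ** 2))  # mean quadratic distance d(t)

# In the chaotic regime d(t) saturates at a constant nonzero value;
# rerunning with small g (e.g. g = 0.5) makes d(t) shrink to zero instead.
print(dists[0], dists[-1])
```

Setting g below the transition reproduces the static regime, where the distance contracts to zero, mirroring the two-regime picture of the abstract.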
Document type: Journal articles

https://hal.archives-ouvertes.fr/jpa-00247065
Contributor: Archives Journal de Physique
Submitted on: Sunday, January 1, 1995 - 8:00:00 AM

File: ajp-jp1v5p409.pdf

Collections

AJP

Citation

B. Cessac. Increase in Complexity in Random Neural Networks. Journal de Physique I, EDP Sciences, 1995, 5 (3), pp.409-432. ⟨10.1051/jp1:1995135⟩. ⟨jpa-00247065⟩
