HAL open archive
Journal article in Network, 1998

Nonlinear feedforward networks with stochastic outputs: infomax implies redundancy reduction.

Abstract

We prove that maximization of mutual information between the output and the input of a feedforward neural network leads to full redundancy reduction under the following sufficient conditions: (i) the input signal is a (possibly nonlinear) invertible mixture of independent components; (ii) there is no input noise; (iii) the activity of each output neuron is a (possibly) stochastic variable with a probability distribution depending on the stimulus through a deterministic function of the inputs (where both the probability distributions and the functions can be different from neuron to neuron); (iv) optimization of the mutual information is performed over all these deterministic functions. This result extends that obtained by Nadal and Parga (1994) who considered the case of deterministic outputs.
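The result above says that, for noiseless inputs that are an invertible mixture of independent components, maximizing the input-output mutual information over the deterministic input-output functions drives the outputs toward independence. A minimal numerical sketch of this principle, in the special case of a linear mixture, deterministic logistic outputs, and the natural-gradient infomax rule of Bell and Sejnowski (1995) (the mixing matrix, learning rate, and source distribution below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
# condition (i): inputs are an invertible mixture of independent components
# (here a linear mixture of Laplacian sources, a simple special case)
s = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])          # invertible mixing matrix (illustrative)
x = A @ s                           # observed inputs; no input noise (condition (ii))

# deterministic input-output functions g_i(x) = sigmoid((W x)_i) (condition (iii))
W = np.eye(2)
lr = 0.05
for _ in range(1000):
    u = W @ x
    y = 1.0 / (1.0 + np.exp(-u))    # logistic output nonlinearity
    # natural-gradient ascent on output entropy; for noiseless deterministic
    # outputs this coincides with maximizing I(output; input)
    W += lr * (np.eye(2) + (1.0 - 2.0 * y) @ u.T / n) @ W

u = W @ x
corr = np.corrcoef(u)
print("residual output correlation:", abs(corr[0, 1]))
print("input correlation:", abs(np.corrcoef(x)[0, 1]))
```

After training, the correlation between the two output channels is far smaller than the correlation between the mixed inputs, illustrating the redundancy reduction that the theorem guarantees under conditions (i)-(iv).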
No file deposited

Dates and versions

hal-00143780 , version 1 (26-04-2007)

Identifiers

Cite

J. P. Nadal, N. Brunel, N. Parga. Nonlinear feedforward networks with stochastic outputs: infomax implies redundancy reduction. Network, 1998, 9 (2), pp. 207-217. ⟨10.1088/0954-898X/9/2/004⟩. ⟨hal-00143780⟩