Prosopagnosia in high capacity neural networks storing uncorrelated classes
Abstract
We present a synaptic matrix that can efficiently store, in attractor neural networks (ANN) and perceptrons, patterns organized in uncorrelated classes. We find a storage capacity limit that increases with m, the overlap of a pattern with its class ancestor, and diverges as m → 1. The probability distribution of the local stability parameters is studied, leading to a complete analysis of the performance of a perceptron with this synaptic matrix, and to a qualitative understanding of the behavior of the corresponding ANN. The analysis of the retrieval attractor of the ANN is completed via statistical mechanics. The motivation for constructing this matrix was to enable the study of a model for prosopagnosia, i.e. the shift from individual to class recall under lesion, modeled here as a random deterioration of the synaptic efficacies. The retrieval properties of the model with the proposed synaptic matrix under random synaptic dilution are studied in detail. Finally, we compare our synaptic matrix with a generic matrix whose stability parameters are all positive.
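The class structure described above can be illustrated with a small numerical sketch. The construction below is a standard way to generate patterns with a prescribed overlap m with a class ancestor, together with a generic Hebbian synaptic matrix and random dilution; it is an assumption for illustration only, not the specific high-capacity matrix constructed in the paper, and all variable names (`ancestor`, `patterns`, `Jd`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000   # number of neurons (illustrative size)
m = 0.8    # target overlap of a pattern with its class ancestor
P = 20     # number of patterns in the class

# Class ancestor: a random +/-1 pattern.
ancestor = rng.choice([-1, 1], size=N)

# Descendants: each bit agrees with the ancestor with probability (1+m)/2,
# so the expected overlap (1/N) * sum_i ancestor_i * pattern_i equals m.
flips = rng.random((P, N)) < (1 - m) / 2
patterns = np.where(flips, -ancestor, ancestor)

overlap = ancestor @ patterns[0] / N
print(f"measured overlap with ancestor: {overlap:.3f}")  # close to m

# Generic Hebbian synaptic matrix (NOT the paper's construction).
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0)

# Lesion: random synaptic dilution killing a fraction d of the efficacies.
d = 0.5
mask = rng.random((N, N)) >= d
Jd = J * mask

# One step of zero-temperature dynamics starting from a stored pattern.
h = Jd @ patterns[0]
s = np.where(h >= 0, 1, -1)
print("overlap with stored pattern:", s @ patterns[0] / N)
print("overlap with class ancestor:", s @ ancestor / N)
```

Comparing the two final overlaps as the dilution d grows gives a qualitative picture of the individual-to-class crossover the abstract refers to: under heavy lesion the network state tends to align with the class ancestor rather than with the individual pattern.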