Journal article in IEEE Transactions on Neural Networks, 2011

Sparse neural networks with large learning diversity

Abstract

Coded recurrent neural networks with three levels of sparsity are introduced. The first level relates to the size of the messages, which are much smaller than the number of available neurons. The second is provided by a particular coding rule that acts as a local constraint on neural activity. The third is a characteristic of the low final connection density of the network after the learning phase. Although the proposed network is very simple, being based on binary neurons and binary connections, it is able to learn a large number of messages and recall them, even in the presence of strong erasures. The performance of the network is assessed both as a classifier and as an associative memory.
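The mechanism the abstract describes can be sketched as a clique-based associative memory: neurons are grouped into clusters, the coding rule activates one neuron per cluster, learning a message amounts to adding binary connections forming a clique, and recall fills erased positions by a winner-take-all vote within each cluster. The class name, method names, and the simple scoring rule below are illustrative assumptions, not the paper's exact algorithm; a minimal sketch:

```python
import itertools

class SparseCliqueMemory:
    """Hypothetical, simplified sketch of a clique-based sparse
    associative memory with binary neurons and binary connections."""

    def __init__(self, n_clusters, cluster_size):
        self.c = n_clusters      # number of clusters (message length)
        self.l = cluster_size    # neurons per cluster (alphabet size)
        self.edges = set()       # binary connections between (cluster, neuron) pairs

    def learn(self, message):
        # One symbol (active neuron index) per cluster: the message
        # becomes a clique over its c active neurons.
        nodes = [(i, s) for i, s in enumerate(message)]
        for a, b in itertools.combinations(nodes, 2):
            self.edges.add((a, b))
            self.edges.add((b, a))

    def recall(self, partial):
        # partial: list with None at erased positions.
        known = [(i, s) for i, s in enumerate(partial) if s is not None]
        result = list(partial)
        for i, s in enumerate(partial):
            if s is None:
                # Winner-take-all inside the cluster: keep the neuron
                # with the most connections to the known active neurons.
                scores = [sum(((i, n), k) in self.edges for k in known)
                          for n in range((self.l))]
                result[i] = max(range(self.l), key=lambda n: scores[n])
        return result
```

For example, after `learn((1, 2, 3, 4))`, calling `recall([1, None, 3, 4])` recovers the erased symbol, since only neuron 2 of the second cluster is connected to all three known neurons. The binary connection set illustrates why the final network is sparse: only neuron pairs co-occurring in some learned message are ever connected.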

Dates and versions

hal-00609246, version 1 (18-07-2011)


Cite

Vincent Gripon, Claude Berrou. Sparse neural networks with large learning diversity. IEEE Transactions on Neural Networks, 2011, 22 (7), pp. 1087-1096. ⟨10.1109/TNN.2011.2146789⟩. ⟨hal-00609246⟩