The committee machine: Computational to statistical gaps in learning a two-layers neural network
Journal article: Advances in Neural Information Processing Systems, 2018


Abstract

Heuristic tools from statistical physics have been used in the past to locate the phase transitions and compute the optimal learning and generalization errors in the teacher-student scenario in multi-layer neural networks. In this contribution, we provide a rigorous justification of these approaches for a two-layer neural network model called the committee machine. We also introduce a version of the approximate message passing (AMP) algorithm for the committee machine that allows optimal learning to be performed in polynomial time for a large set of parameters. We find that there are regimes in which a low generalization error is information-theoretically achievable while the AMP algorithm fails to deliver it, strongly suggesting that no efficient algorithm exists for those cases, and unveiling a large computational gap.
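To make the teacher-student scenario concrete, the sketch below generates synthetic data from a committee-machine teacher: each label is the majority vote of K sign units applied to a Gaussian input. This is a minimal illustration of the standard committee-machine setup, not the paper's code; the dimensions (n, K, m) and the 1/sqrt(n) scaling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, K, m = 500, 3, 2000                          # input dim, hidden units, samples (illustrative)

W_teacher = rng.standard_normal((K, n))         # i.i.d. Gaussian teacher weights
X = rng.standard_normal((m, n))                 # i.i.d. Gaussian inputs

# Each hidden unit "votes" with the sign of its preactivation; the teacher
# label is the majority vote. K odd avoids ties in the outer sign.
hidden = np.sign(X @ W_teacher.T / np.sqrt(n))  # shape (m, K), entries in {-1, +1}
y = np.sign(hidden.sum(axis=1))                 # committee-machine labels in {-1, +1}
```

A student network of the same architecture would then be trained on the pairs (X, y); the abstract's question is in which regimes the teacher weights can be recovered information-theoretically, and whether AMP can do so in polynomial time.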
Main file: comittee_machine.pdf (976.3 KB)
Origin: Files produced by the author(s)

Dates and versions

cea-01933130, version 1 (23-11-2018)

Identifiers

  • HAL Id: cea-01933130, version 1

Cite

Benjamin Aubin, Antoine Maillard, Jean Barbier, Florent Krzakala, Nicolas Macris, et al.. The committee machine: Computational to statistical gaps in learning a two-layers neural network. Advances in Neural Information Processing Systems, 2018, 31, pp.3227-3238. ⟨cea-01933130⟩
128 views
200 downloads
