Conference paper, 2019

REVE: Regularizing Deep Learning with Variational Entropy Bound

Antoine Saporta
Yifu Chen
Michael Blot
Matthieu Cord

Abstract

Studies of the generalization performance of machine learning algorithms through the lens of information theory suggest that compressed representations can guarantee good generalization, inspiring many compression-based regularization methods. In this paper, we introduce REVE, a new regularization scheme. Noting that compressing the representation itself can be suboptimal, our first contribution is to identify a variable that is directly responsible for the final prediction; our method aims at minimizing the class-conditional entropy of this variable. Second, we introduce a variational upper bound on this conditional entropy term. Finally, we propose a scheme to instantiate a tractable loss that is integrated into the training procedure of the neural network, and we demonstrate its effectiveness on different neural networks and datasets.
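The abstract outlines a concrete recipe: take the variable Z that feeds the final prediction, upper-bound its class-conditional entropy H(Z | Y) with a variational distribution q(z | y) (for any q, H(Z | Y) ≤ E[−log q(Z | Y)], since the gap is a non-negative KL term), and add that bound as a penalty to the training loss. Below is a minimal PyTorch sketch of this idea, not the authors' published implementation: the diagonal-Gaussian form of q, the choice of the penultimate-layer activation as Z, and the names GaussianEntropyBound and lambda_reve are illustrative assumptions.

```python
# Minimal sketch of a REVE-style regularizer (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianEntropyBound(nn.Module):
    """Variational upper bound on the class-conditional entropy H(Z | Y),
    using a diagonal Gaussian q(z | y) with learned per-class parameters."""
    def __init__(self, num_classes: int, dim: int):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(num_classes, dim))
        self.log_var = nn.Parameter(torch.zeros(num_classes, dim))

    def forward(self, z: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        mu, log_var = self.mu[y], self.log_var[y]           # shape (B, dim)
        # -log q(z | y) for a diagonal Gaussian, up to the additive
        # constant 0.5 * log(2 * pi) per dimension
        nll = 0.5 * (log_var + (z - mu) ** 2 / log_var.exp())
        return nll.sum(dim=1).mean()                        # average over batch

def training_step(model, head, bound, x, y, lambda_reve=0.1):
    """Cross-entropy plus the weighted entropy bound; lambda_reve is a
    hypothetical hyperparameter name for the regularization strength."""
    z = model(x)                 # representation responsible for the prediction
    logits = head(z)
    return F.cross_entropy(logits, y) + lambda_reve * bound(z, y)
```

Minimizing the penalty jointly over the network and the parameters of q both tightens the bound and concentrates Z given the class, which is the compression effect the abstract targets.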

Dates and versions

hal-02316946, version 1 (15-10-2019)

Identifiers

  • HAL Id: hal-02316946
  • DOI: 10.1109/ICIP.2019.8804396

Cite

Antoine Saporta, Yifu Chen, Michael Blot, Matthieu Cord. REVE: Regularizing Deep Learning with Variational Entropy Bound. 2019 IEEE International Conference on Image Processing (ICIP), Sep 2019, Taipei, Taiwan. pp.1610-1614, ⟨10.1109/ICIP.2019.8804396⟩. ⟨hal-02316946⟩
