REVE: Regularizing Deep Learning with Variational Entropy Bound

Abstract: Information-theoretic studies of the generalization performance of machine learning algorithms suggest that compressed representations can guarantee good generalization, and have inspired many compression-based regularization methods. In this paper, we introduce REVE, a new regularization scheme. Noting that compressing the representation can be sub-optimal, our first contribution is to identify a variable that is directly responsible for the final prediction. Our method aims at compressing the class-conditional entropy of this variable. Second, we introduce a variational upper bound on this conditional entropy term. Finally, we propose a scheme to instantiate a tractable loss that is integrated into the training procedure of the neural network, and we demonstrate its efficiency on different neural networks and datasets.
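The full construction of the bound and the loss is given in the paper itself; as a non-authoritative illustration of the general idea, the sketch below uses the classical variational upper bound H(Z | Y) <= E[-log q(Z | Y)], which holds for any variational distribution q, as a regularization term during training. The choice of Z as the penultimate activation, the diagonal-Gaussian family for q(z | y), and the weight beta are assumptions made for this example, not details taken from the paper.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class ClassConditionalGaussian(nn.Module):
    """Variational q(z | y): one diagonal Gaussian per class (an assumed family)."""

    def __init__(self, num_classes: int, dim: int):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(num_classes, dim))
        self.log_var = nn.Parameter(torch.zeros(num_classes, dim))

    def neg_log_prob(self, z: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # -log q(z | y) for a diagonal Gaussian, averaged over the batch;
        # its expectation upper-bounds the conditional entropy H(Z | Y).
        mu, log_var = self.mu[y], self.log_var[y]
        nll = 0.5 * (log_var + math.log(2 * math.pi)
                     + (z - mu) ** 2 / log_var.exp()).sum(dim=1)
        return nll.mean()


class Net(nn.Module):
    """Toy classifier exposing an intermediate variable z (here, the penultimate activation)."""

    def __init__(self, in_dim=784, hidden=128, num_classes=10):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):
        z = self.body(x)
        return self.head(z), z


net = Net()
q = ClassConditionalGaussian(num_classes=10, dim=128)
opt = torch.optim.Adam(list(net.parameters()) + list(q.parameters()), lr=1e-3)
beta = 0.1  # regularization weight (hypothetical value)

# One training step on random stand-in data.
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
logits, z = net(x)
loss = F.cross_entropy(logits, y) + beta * q.neg_log_prob(z, y)
opt.zero_grad()
loss.backward()
opt.step()
```

Minimizing the second term jointly over the network and q tightens the bound while compressing the class-conditional entropy, which is the spirit of the scheme described in the abstract.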
Document type: Conference papers

https://hal.archives-ouvertes.fr/hal-02316946
Contributor: Antoine Saporta
Submitted on: Tuesday, October 15, 2019 - 4:49:33 PM
Last modification on: Tuesday, March 23, 2021 - 9:28:03 AM

Citation

Antoine Saporta, Yifu Chen, Michael Blot, Matthieu Cord. REVE: Regularizing Deep Learning with Variational Entropy Bound. 2019 IEEE International Conference on Image Processing (ICIP), Sep 2019, Taipei, Taiwan. pp.1610-1614, ⟨10.1109/ICIP.2019.8804396⟩. ⟨hal-02316946⟩
