Bayesian neural networks become heavier-tailed with depth

Mariia Vladimirova 1, Julyan Arbel 1, Pablo Mesejo 2
1 MISTIS - Modelling and Inference of Complex and Structured Stochastic Systems
Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann, INPG - Institut National Polytechnique de Grenoble
Abstract: We investigate deep Bayesian neural networks with Gaussian priors on the weights and ReLU-like nonlinearities, shedding light on novel distribution properties at the level of the neural network units. The main thrust of the paper is to establish that the prior distribution induced on the units, before and after activation, becomes increasingly heavier-tailed with depth. We show that first-layer units are Gaussian, second-layer units are sub-Exponential, and we introduce sub-Weibull distributions to characterize the units of deeper layers. This result provides new theoretical insight into deep Bayesian neural networks, underpinning their practical potential. The workshop paper is based on the original paper Vladimirova et al. (2018).
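
The tail-growth claim in the abstract is easy to probe empirically. Below is a minimal simulation sketch (not the authors' code; the layer widths, fixed input, sample size, and fan-in weight scaling are illustrative assumptions): Gaussian weights are drawn repeatedly, a fixed input is pushed through ReLU layers, and the excess kurtosis and an extreme quantile of one unit's pre-activation are reported per layer. Both statistics are expected to grow with depth if the induced prior indeed becomes heavier-tailed.

import numpy as np

rng = np.random.default_rng(0)

n_samples = 20000            # number of draws from the weight prior (assumption)
widths = [50, 50, 50, 50]    # hidden-layer widths (assumption)
x = rng.normal(size=50)      # one fixed network input (assumption)

# Collect the pre-activation of the first unit of every layer, for each prior draw.
samples = np.empty((n_samples, len(widths)))
for s in range(n_samples):
    h = x
    for ell, width in enumerate(widths):
        # Gaussian prior on the weights; the fan-in scaling is an assumption,
        # any fixed Gaussian prior gives the same qualitative picture.
        W = rng.normal(scale=1.0 / np.sqrt(len(h)), size=(width, len(h)))
        pre = W @ h
        samples[s, ell] = pre[0]
        h = np.maximum(pre, 0.0)   # ReLU nonlinearity

# Heavier tails show up as larger excess kurtosis and slower-decaying extreme quantiles.
for ell in range(len(widths)):
    u = samples[:, ell] / samples[:, ell].std()
    kurtosis = np.mean(u ** 4) - 3.0
    q999 = np.quantile(np.abs(u), 0.999)
    print(f"layer {ell + 1}: excess kurtosis ~ {kurtosis:.2f}, 99.9% quantile of |unit| ~ {q999:.2f}")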
Document type:
Conference paper
NeurIPS 2018 - Thirty-second Conference on Neural Information Processing Systems, Dec 2018, Montréal, Canada. pp.1-7, 〈http://bayesiandeeplearning.org/〉

https://hal.archives-ouvertes.fr/hal-01950658
Contributor: Julyan Arbel
Submitted on: Tuesday, December 11, 2018 - 03:52:54
Last modified on: Wednesday, January 9, 2019 - 09:30:08

File

BDL.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01950658, version 1

Citation

Mariia Vladimirova, Julyan Arbel, Pablo Mesejo. Bayesian neural networks become heavier-tailed with depth. NeurIPS 2018 - Thirty-second Conference on Neural Information Processing Systems, Dec 2018, Montréal, Canada. pp.1-7, 〈http://bayesiandeeplearning.org/〉. 〈hal-01950658〉
