Conference papers

Bayesian neural networks become heavier-tailed with depth

Mariia Vladimirova 1, Julyan Arbel 1, Pablo Mesejo 2
1 MISTIS - Modelling and Inference of Complex and Structured Stochastic Systems
Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann, INPG - Institut National Polytechnique de Grenoble
Abstract: We investigate deep Bayesian neural networks with Gaussian priors on the weights and ReLU-like nonlinearities, shedding light on novel distribution properties at the level of the neural network units. The main thrust of the paper is to establish that the prior distribution induced on the units, before and after activation, becomes increasingly heavier-tailed with depth. We show that first-layer units are Gaussian, second-layer units are sub-Exponential, and we introduce sub-Weibull distributions to characterize the units of deeper layers. This result provides new theoretical insight on deep Bayesian neural networks, underpinning their practical potential. The workshop paper is based on the original paper Vladimirova et al. (2018).
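The heavier-tails-with-depth claim can be checked empirically with a small Monte Carlo sketch (not from the paper; layer width, depth, and the 1/sqrt(width) weight scaling are illustrative assumptions): draw many independent networks from the Gaussian prior, propagate a fixed input, and track the excess kurtosis of one pre-activation unit per layer. Gaussian units give excess kurtosis near 0; heavier tails push it up.

```python
import numpy as np

def unit_kurtosis_by_depth(depth=4, width=10, n_draws=100_000, seed=0):
    """Excess kurtosis of one pre-activation unit at each layer,
    estimated over independent Gaussian prior draws of the weights.

    Illustrative sketch: width, depth, and the 1/sqrt(width) weight
    scaling are assumptions, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    h = np.ones((n_draws, width))  # fixed input, one copy per prior draw
    kurt = []
    for _ in range(depth):
        # Independent Gaussian prior on this layer's weights
        w = rng.normal(0.0, 1.0, size=(n_draws, width, width)) / np.sqrt(width)
        g = np.einsum('nij,nj->ni', w, h)  # pre-activation units
        u = g[:, 0] - g[:, 0].mean()
        kurt.append(float((u**4).mean() / (u**2).mean()**2 - 3.0))
        h = np.maximum(g, 0.0)  # ReLU
    return kurt

print(unit_kurtosis_by_depth())
```

In this simulation the first layer's excess kurtosis is close to 0 (Gaussian units), and the estimate grows with each subsequent layer, consistent with the sub-Weibull characterization.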


https://hal.archives-ouvertes.fr/hal-01950658
Contributor: Julyan Arbel
Submitted on: Tuesday, December 11, 2018 - 3:52:54 AM
Last modification on: Thursday, March 26, 2020 - 8:49:33 PM
Document(s) archived on: Tuesday, March 12, 2019 - 12:59:06 PM

File

BDL.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01950658, version 1

Citation

Mariia Vladimirova, Julyan Arbel, Pablo Mesejo. Bayesian neural networks become heavier-tailed with depth. NeurIPS 2018 - Thirty-second Conference on Neural Information Processing Systems, Dec 2018, Montréal, Canada. pp.1-7. ⟨hal-01950658⟩

Metrics

Record views: 158
File downloads: 90