
Bayesian neural networks become heavier-tailed with depth

Mariia Vladimirova 1, Julyan Arbel 1, Pablo Mesejo 2
1 MISTIS - Modelling and Inference of Complex and Structured Stochastic Systems
Inria Grenoble - Rhône-Alpes, Grenoble INP - Institut polytechnique de Grenoble - Grenoble Institute of Technology, LJK - Laboratoire Jean Kuntzmann
Abstract: We investigate deep Bayesian neural networks with Gaussian priors on the weights and ReLU-like nonlinearities, shedding light on novel distribution properties at the level of the neural network units. The main thrust of the paper is to establish that the prior distribution induced on the units, both before and after activation, becomes increasingly heavier-tailed with depth. We show that first-layer units are Gaussian, second-layer units are sub-Exponential, and we introduce sub-Weibull distributions to characterize the units of deeper layers. This result provides new theoretical insight on deep Bayesian neural networks, underpinning their practical potential. The workshop paper is based on the original paper Vladimirova et al. (2018).
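The abstract's claim lends itself to a quick Monte Carlo sanity check: draw fully connected ReLU networks with i.i.d. Gaussian priors on the weights and track a pre-activation unit at each layer. Heavier tails show up as excess kurtosis growing with depth (a Gaussian has excess kurtosis zero). The sketch below is an illustrative experiment, not the authors' code; the width, depth, seed, and kurtosis diagnostic are all assumptions chosen for visibility.

```python
import numpy as np

# Monte Carlo illustration (not the paper's experiment): sample ReLU networks
# with i.i.d. Gaussian priors on the weights and measure the excess kurtosis
# of one pre-activation unit per layer, across prior draws.
rng = np.random.default_rng(0)

n_draws = 20000     # number of prior draws of the weights
width = 10          # hidden width (kept small so non-Gaussianity is visible)
depth = 4           # number of layers
x = np.ones(width)  # arbitrary fixed input

def excess_kurtosis(z):
    # Sample excess kurtosis: 0 for a Gaussian, positive for heavier tails.
    z = (z - z.mean()) / z.std()
    return float(np.mean(z**4) - 3.0)

samples = np.empty((n_draws, depth))
for i in range(n_draws):
    h = x
    for ell in range(depth):
        # Gaussian prior on the weights, variance scaled by fan-in
        W = rng.normal(0.0, 1.0 / np.sqrt(h.size), size=(width, h.size))
        pre = W @ h
        samples[i, ell] = pre[0]   # track the first unit of each layer
        h = np.maximum(pre, 0.0)   # ReLU nonlinearity

ks = [excess_kurtosis(samples[:, ell]) for ell in range(depth)]
for ell, k in enumerate(ks, start=1):
    print(f"layer {ell}: excess kurtosis ~ {k:.2f}")
```

In line with the paper's result, the first-layer unit (a Gaussian linear combination of a fixed input) shows excess kurtosis near zero, while deeper layers show markedly larger values, a numerical hint of the Gaussian → sub-Exponential → sub-Weibull progression.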


Contributor: Julyan Arbel
Submitted on: Tuesday, December 11, 2018 - 3:52:54 AM
Last modification on: Friday, February 4, 2022 - 3:20:05 AM
Long-term archiving on: Tuesday, March 12, 2019 - 12:59:06 PM




  • HAL Id : hal-01950658, version 1



Mariia Vladimirova, Julyan Arbel, Pablo Mesejo. Bayesian neural networks become heavier-tailed with depth. NeurIPS 2018 - Thirty-second Conference on Neural Information Processing Systems, Dec 2018, Montréal, Canada. pp.1-7. ⟨hal-01950658⟩


