
A General Framework for the Disintegration of PAC-Bayesian Bounds

Abstract: PAC-Bayesian bounds are known to be tight and informative when studying the generalization ability of randomized classifiers. However, when applied to some families of deterministic models such as neural networks, they require a loose and costly derandomization step. As an alternative to this step, we introduce new PAC-Bayesian generalization bounds that are disintegrated, i.e., they give guarantees for a single hypothesis instead of the usual averaged analysis. Our bounds are easily optimizable and can be used to design learning algorithms. We illustrate the interest of our result on neural networks and show a significant practical improvement over the state-of-the-art framework.
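For context, the contrast drawn in the abstract can be sketched with two illustrative bound forms (these are generic textbook-style statements, not the paper's exact theorems). A classical averaged PAC-Bayesian bound, with prior $\pi$ and posterior $\rho$ over hypotheses, true risk $R_{\mathcal{D}}$, empirical risk $R_S$ on an $m$-sample $S$, holds with probability at least $1-\delta$ over $S$:

```latex
\mathop{\mathbb{E}}_{h \sim \rho} R_{\mathcal{D}}(h)
\;\le\;
\mathop{\mathbb{E}}_{h \sim \rho} R_S(h)
\;+\;
\sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{m}}{\delta}}{2m}}
```

A disintegrated bound instead holds, with probability at least $1-\delta$ over both the draw of $S$ and of a single $h \sim \rho$, for that one hypothesis, with the KL divergence replaced by a pointwise density-ratio term (schematically):

```latex
R_{\mathcal{D}}(h)
\;\le\;
R_S(h)
\;+\;
\sqrt{\frac{\ln\frac{d\rho}{d\pi}(h) + \ln\frac{2\sqrt{m}}{\delta}}{2m}}
```

The second form is what makes the guarantee directly applicable to one deterministic model (e.g., a single trained network) without an averaged analysis or a separate derandomization step.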
Document type: Preprints, Working Papers, ...
Contributor: Paul Viallard
Submitted on: Friday, October 8, 2021 - 10:10:48 AM
Last modification on: Saturday, June 25, 2022 - 7:27:38 PM
Files produced by the author(s)
  • HAL Id: hal-03143025, version 2
  • arXiv: 2102.08649



Paul Viallard, Pascal Germain, Amaury Habrard, Emilie Morvant. A General Framework for the Disintegration of PAC-Bayesian Bounds. {date}. ⟨hal-03143025v2⟩