Conference papers

PAC-Bayesian Theory Meets Bayesian Inference

Abstract: We exhibit a strong link between frequentist PAC-Bayesian risk bounds and the Bayesian marginal likelihood. That is, for the negative log-likelihood loss function, we show that the minimization of PAC-Bayesian generalization risk bounds maximizes the Bayesian marginal likelihood. This provides an alternative explanation of the Bayesian Occam's razor criterion, under the assumption that the data is generated by an i.i.d. distribution. Moreover, as the negative log-likelihood is an unbounded loss function, we motivate and propose a PAC-Bayesian theorem tailored to the sub-gamma loss family, and we show that our approach is sound on classical Bayesian linear regression tasks.
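The quantity at the heart of the abstract, the Bayesian marginal likelihood (evidence), has a closed form in the linear regression setting the paper experiments with. The following is a minimal sketch, not code from the paper: it assumes an isotropic Gaussian prior w ~ N(0, prior_var·I) and Gaussian observation noise, under which marginalizing out w gives y ~ N(0, noise_var·I + prior_var·XXᵀ). The function name and hyperparameter names are ours.

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_marginal_likelihood(X, y, prior_var=1.0, noise_var=0.1):
    """Log evidence log p(y | X) for Bayesian linear regression.

    Assumes prior w ~ N(0, prior_var * I) and noise N(0, noise_var).
    Integrating out w yields y ~ N(0, noise_var*I + prior_var * X X^T),
    so the evidence is a multivariate Gaussian density evaluated at y.
    """
    n = X.shape[0]
    cov = noise_var * np.eye(n) + prior_var * (X @ X.T)
    return multivariate_normal(mean=np.zeros(n), cov=cov).logpdf(y)

# Toy data: responses from a true linear model plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.3 * rng.normal(size=20)

# In the paper's framework, maximizing this quantity over model choices
# corresponds to minimizing a PAC-Bayesian generalization risk bound.
print(log_marginal_likelihood(X, y))
```

Comparing this value across hyperparameter settings (e.g. different prior variances) is the Occam's razor model selection that the paper reinterprets through PAC-Bayesian bounds.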
Contributor: Pascal Germain
Submitted on: Tuesday, February 14, 2017 - 1:49:43 PM
Last modification on: Tuesday, September 22, 2020 - 3:57:48 AM
Long-term archiving on: Monday, May 15, 2017 - 2:13:12 PM




  • HAL Id: hal-01324072, version 3
  • arXiv: 1605.08636



Pascal Germain, Francis Bach, Alexandre Lacoste, Simon Lacoste-Julien. PAC-Bayesian Theory Meets Bayesian Inference. Neural Information Processing Systems (NIPS 2016), Dec 2016, Barcelona, Spain. pp. 1876-1884. ⟨hal-01324072v3⟩


