PAC-Bayesian Theory Meets Bayesian Inference

Abstract: We exhibit a strong link between frequentist PAC-Bayesian bounds and the Bayesian marginal likelihood. That is, for the negative log-likelihood loss function, we show that minimizing PAC-Bayesian generalization bounds maximizes the Bayesian marginal likelihood. This provides an alternative explanation of the Bayesian Occam's razor criterion, under the assumption that the data are generated by an i.i.d. distribution. Moreover, as the negative log-likelihood is an unbounded loss function, we motivate and propose a PAC-Bayesian theorem tailored to the sub-gamma loss family, and we show that our approach is sound on classical Bayesian linear regression tasks.
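As a rough sketch of the identity behind this claim (the notation below is introduced here for illustration and need not match the paper's: \hat{L}_D is the empirical negative log-likelihood risk, \pi the prior, \rho the posterior, n the sample size), the change-of-measure argument underlying PAC-Bayesian bounds gives

\[
\min_{\rho}\left\{ \mathbb{E}_{f\sim\rho}\!\left[\hat{L}_D(f)\right] + \frac{1}{n}\,\mathrm{KL}(\rho\,\|\,\pi) \right\}
\;=\; -\frac{1}{n}\ln \int \pi(f)\, e^{-n\hat{L}_D(f)}\, df
\;=\; -\frac{1}{n}\ln p(D),
\]

where \hat{L}_D(f) = \frac{1}{n}\sum_{i=1}^{n} -\ln p(y_i \mid x_i, f). Since e^{-n\hat{L}_D(f)} is exactly the likelihood of the data, the minimum is attained by the Gibbs posterior \rho^*(f) \propto \pi(f)\, e^{-n\hat{L}_D(f)}, which coincides with the Bayesian posterior, and the minimized objective equals the negative log marginal likelihood up to the factor 1/n; hence minimizing the bound-driven objective on the left maximizes p(D).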

https://hal.archives-ouvertes.fr/hal-01324072
Contributor: Pascal Germain
Submitted on: Tuesday, November 1, 2016 - 4:04:30 PM
Last modification on: Tuesday, April 24, 2018 - 5:20:15 PM
Long-term archiving on: Tuesday, March 14, 2017 - 5:01:28 PM

File

BPB.pdf (files produced by the author(s))

Identifiers

  • HAL Id: hal-01324072, version 2
  • arXiv: 1605.08636

Citation

Pascal Germain, Francis Bach, Alexandre Lacoste, Simon Lacoste-Julien. PAC-Bayesian Theory Meets Bayesian Inference. Neural Information Processing Systems (NIPS 2016), Dec 2016, Barcelona, Spain. ⟨hal-01324072v2⟩
