PAC-Bayesian Theory Meets Bayesian Inference

Abstract: We exhibit a strong link between frequentist PAC-Bayesian risk bounds and the Bayesian marginal likelihood. That is, for the negative log-likelihood loss function, we show that the minimization of PAC-Bayesian generalization risk bounds maximizes the Bayesian marginal likelihood. This provides an alternative explanation of the Bayesian Occam's razor criterion, under the assumption that the data is generated by an i.i.d. distribution. Moreover, as the negative log-likelihood is an unbounded loss function, we motivate and propose a PAC-Bayesian theorem tailored to the sub-gamma loss family, and we show that our approach is sound on classical Bayesian linear regression tasks.
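A brief sketch of the claimed link (notation assumed here, not quoted from the paper): writing $\widehat{\mathcal{L}}_S(h) = -\tfrac{1}{n}\sum_{i=1}^n \ln p(y_i \mid x_i, h)$ for the empirical negative log-likelihood of a hypothesis $h$ on a sample $S$ of size $n$, with prior $\pi$ and posterior $\rho$, the change-of-measure identity underlying PAC-Bayesian bounds gives

$$
\min_{\rho}\Big\{ n\,\mathbb{E}_{h\sim\rho}\big[\widehat{\mathcal{L}}_S(h)\big] + \mathrm{KL}(\rho\,\|\,\pi) \Big\}
\;=\; -\ln \mathbb{E}_{h\sim\pi}\Big[ e^{-n\,\widehat{\mathcal{L}}_S(h)} \Big]
\;=\; -\ln \int \pi(h)\, \prod_{i=1}^{n} p(y_i \mid x_i, h)\, dh,
$$

with the minimum attained by the Gibbs posterior $\rho^*(h) \propto \pi(h)\, e^{-n\,\widehat{\mathcal{L}}_S(h)}$, which for this loss coincides with the Bayesian posterior. The right-hand side is the negative log marginal likelihood, so minimizing the empirical-risk-plus-complexity part of a PAC-Bayesian bound of this form amounts to maximizing the marginal likelihood.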
Document type: Conference paper
Neural Information Processing Systems (NIPS 2016), Dec 2016, Barcelona, Spain. pp. 1876-1884, 2016, Proceedings of the Neural Information Processing Systems Conference 29

Cited literature: 48 references

https://hal.archives-ouvertes.fr/hal-01324072
Contributor: Pascal Germain
Submitted on: Tuesday, 14 February 2017 - 13:49:43
Last modified on: Thursday, 26 April 2018 - 10:29:13
Document(s) archived on: Monday, 15 May 2017 - 14:13:12

File

BPB.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01324072, version 3
  • ARXIV : 1605.08636

Collections

Citation

Pascal Germain, Francis Bach, Alexandre Lacoste, Simon Lacoste-Julien. PAC-Bayesian Theory Meets Bayesian Inference. Neural Information Processing Systems (NIPS 2016), Dec 2016, Barcelona, Spain. pp. 1876-1884, 2016, Proceedings of the Neural Information Processing Systems Conference 29. 〈hal-01324072v3〉

Metrics

Record views: 381
File downloads: 200