
Learning with BOT - Bregman and Optimal Transport divergences

Abstract: The introduction of the Kullback-Leibler divergence in PAC-Bayesian theory can be traced back to the work of [1]. It allows one to design learning procedures whose generalization errors are based on an optimal trade-off between accuracy on the training set and complexity. This complexity is penalized through the Kullback-Leibler divergence from a prior distribution, modeling domain knowledge over the set of candidates or weak learners. In the context of high-dimensional statistics, it gives rise to sparsity oracle inequalities or, more recently, sparsity regret bounds, where the complexity is measured in terms of ℓ0- or ℓ1-norms. In this paper, we propose to extend the PAC-Bayesian theory to obtain more generic regret bounds for sequential weighted averages, where (1) the measure of complexity is based on any ad hoc criterion and (2) the prior distribution can be very simple. These results arise by introducing a new measure of divergence from the prior in terms of Bregman divergence or Optimal Transport.
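To make the relation between the divergences in the title concrete, here is a minimal numerical sketch (not from the paper) illustrating a standard fact the abstract builds on: the Kullback-Leibler divergence is the Bregman divergence generated by the negative-entropy function. The function and variable names below are illustrative.

```python
import numpy as np

def bregman(f, grad_f, p, q):
    """Bregman divergence D_f(p, q) = f(p) - f(q) - <grad f(q), p - q>."""
    return f(p) - f(q) - np.dot(grad_f(q), p - q)

# Negative entropy and its gradient on the probability simplex.
neg_entropy = lambda x: np.sum(x * np.log(x))
grad_neg_entropy = lambda x: np.log(x) + 1.0

p = np.array([0.2, 0.3, 0.5])  # example probability vectors
q = np.array([0.4, 0.4, 0.2])

kl = np.sum(p * np.log(p / q))                       # Kullback-Leibler divergence
breg = bregman(neg_entropy, grad_neg_entropy, p, q)  # Bregman divergence of neg. entropy
# On the simplex, the two coincide.
```

This is why PAC-Bayesian bounds stated with a KL penalty can, as the paper proposes, be generalized to penalties built from other Bregman divergences (or from Optimal Transport costs).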
Document type :
Preprints, Working Papers, ...
Contributor: Sébastien Loustau
Submitted on: Friday, June 18, 2021 - 8:12:58 AM
Last modification on: Tuesday, June 22, 2021 - 3:45:54 AM




  • HAL Id: hal-03262687, version 2



Andrew Chee, Sébastien Loustau. Learning with BOT - Bregman and Optimal Transport divergences. 2021. ⟨hal-03262687v2⟩


