Preprints, Working Papers, ... Year: 2010

Risk bounds in linear regression through PAC-Bayesian truncation

Abstract

We consider the problem of predicting as well as the best linear combination of d given functions in least squares regression, together with variants of this problem that include constraints on the parameters of the linear combination. When the input distribution is known, an algorithm is already known that achieves an expected excess risk of order d/n, where n is the size of the training data. Without this strong assumption, standard results often contain a multiplicative log n factor and require additional assumptions, such as uniform boundedness of the d-dimensional input representation and exponential moments of the output. This work provides new risk bounds for the ridge estimator and the ordinary least squares estimator, and their variants. It also provides shrinkage procedures with convergence rate d/n (i.e., without the logarithmic factor), in expectation and in deviations, under various assumptions. The common and surprising feature of these results is the absence of an exponential moment condition on the output distribution, while still achieving exponential deviations. All risk bounds are obtained through a PAC-Bayesian analysis on truncated differences of losses. Finally, we show that some of these results are not particular to the least squares loss, but can be generalized to similar strongly convex loss functions.
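To make the setting concrete, here is a minimal formalization of the problem described in the abstract; the notation (the dictionary \varphi_1, \dots, \varphi_d, the risk R, and the estimator \hat{\theta}) is introduced here for illustration and may differ from the paper's own conventions. Writing \varphi(X) = (\varphi_1(X), \dots, \varphi_d(X)) for the d-dimensional input representation, the least squares risk of a parameter vector \theta \in \mathbb{R}^d is

    R(\theta) = \mathbb{E}\!\left[ \big( Y - \langle \theta, \varphi(X) \rangle \big)^2 \right],

and the d/n rate claimed above means that an estimator \hat{\theta} built from n i.i.d. observations satisfies

    \mathbb{E}\, R(\hat{\theta}) \;-\; \inf_{\theta \in \mathbb{R}^d} R(\theta) \;\le\; \frac{C\, d}{n}

for some constant C, with no multiplicative \log n factor, and with high-probability (exponential-deviation) analogues holding without exponential moment assumptions on Y.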

Dates and versions

hal-00360268, version 1 (10-02-2009)
hal-00360268, version 2 (03-07-2010)

Identifiers

HAL Id: hal-00360268

Cite

Jean-Yves Audibert, Olivier Catoni. Risk bounds in linear regression through PAC-Bayesian truncation. 2010. ⟨hal-00360268v2⟩