Conference Papers, Year: 2009

Data-driven calibration of linear estimators with minimal penalties

Abstract

This paper tackles the problem of selecting among several linear estimators in non-parametric regression; this includes model selection for linear regression, the choice of a regularization parameter in kernel ridge regression, spline smoothing, or locally weighted regression, and the choice of a kernel in multiple kernel learning. We propose a new algorithm that first consistently estimates the noise variance, based on the concept of minimal penalty previously introduced in the context of model selection. Plugging this variance estimate into Mallows' $C_L$ penalty is then proved to yield an algorithm satisfying an oracle inequality. Simulation experiments with kernel ridge regression and multiple kernel learning show that the proposed algorithm often significantly improves on existing calibration procedures such as generalized cross-validation.
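To make the two-step procedure concrete, here is a minimal Python sketch for a family of linear estimators given by their hat matrices $A_\lambda$ (so that $\hat f_\lambda = A_\lambda Y$). The helper `minimal_penalty_selection`, the grid `C_grid`, and the threshold-based jump detection are illustrative assumptions, not the authors' implementation; only the minimal-penalty shape $2\,\mathrm{tr}(A_\lambda) - \mathrm{tr}(A_\lambda^\top A_\lambda)$ and the final Mallows' $C_L$ step follow the description in the abstract.

```python
import numpy as np

def minimal_penalty_selection(Y, smoothers, C_grid=None, jump_factor=0.5):
    """Select among linear estimators f_hat_lambda = A_lambda @ Y.

    Hypothetical helper for illustration; `smoothers` is a list of
    n x n hat matrices A_lambda.
    """
    Y = np.asarray(Y, dtype=float)
    rss = np.array([np.sum((Y - A @ Y) ** 2) for A in smoothers])   # residual sums of squares
    dfs = np.array([np.trace(A) for A in smoothers])                # effective degrees of freedom tr(A)
    min_pen = np.array([2 * np.trace(A) - np.trace(A.T @ A)         # minimal-penalty shape 2 tr(A) - tr(A'A)
                        for A in smoothers])

    if C_grid is None:
        C_grid = np.logspace(-3, 3, 400)

    # Step 1: for each constant C, minimize RSS + C * (2 tr A - tr A'A)
    # and record the degrees of freedom of the selected estimator.
    selected_df = np.array([dfs[np.argmin(rss + C * min_pen)] for C in C_grid])

    # Step 2: estimate the noise variance as the constant at which the
    # selected degrees of freedom drop sharply ("jump"); here detected by a
    # crude threshold relative to the initial, overfitting value.
    jump = np.argmax(selected_df < jump_factor * selected_df[0])
    sigma2_hat = C_grid[jump]

    # Step 3: plug sigma2_hat into Mallows' C_L penalty, 2 * sigma^2 * tr(A).
    best = int(np.argmin(rss + 2 * sigma2_hat * dfs))
    return best, sigma2_hat


# Toy usage: calibrating the ridge parameter in kernel ridge regression,
# where the hat matrix is A_lambda = K (K + n * lambda * I)^{-1}.
rng = np.random.default_rng(0)
n = 100
X = np.sort(rng.uniform(0, 1, n))
Y = np.sin(2 * np.pi * X) + 0.3 * rng.standard_normal(n)
K = np.exp(-((X[:, None] - X[None, :]) ** 2) / 0.05)        # Gaussian kernel matrix
lambdas = np.logspace(-6, 1, 30)
smoothers = [K @ np.linalg.inv(K + n * lam * np.eye(n)) for lam in lambdas]
best, sigma2_hat = minimal_penalty_selection(Y, smoothers)
print(f"selected lambda = {lambdas[best]:.2e}, estimated noise variance = {sigma2_hat:.3f}")
```

The threshold used to locate the jump is only a rough heuristic for this sketch; the location of the jump is what provides the variance estimate, and the paper's procedure and analysis of that step are more careful.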
Main file
minikernel_journal.pdf (409.37 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00414774 , version 1 (09-09-2009)
hal-00414774 , version 2 (12-09-2011)

Identifiers

HAL Id: hal-00414774

Cite

Sylvain Arlot, Francis Bach. Data-driven calibration of linear estimators with minimal penalties. NIPS 2009 - Advances in Neural Information Processing Systems, Dec 2009, Vancouver, Canada. pp. 46-54. ⟨hal-00414774v2⟩
458 views, 588 downloads
