# Data-driven calibration of linear estimators with minimal penalties

SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Département d'informatique de l'École normale supérieure, ENS Paris - École normale supérieure - Paris, Inria Paris-Rocquencourt, CNRS - Centre National de la Recherche Scientifique : UMR8548
Abstract : This paper tackles the problem of selecting among several linear estimators in non-parametric regression; this includes model selection for linear regression, the choice of a regularization parameter in kernel ridge regression, spline smoothing, or locally weighted regression, and the choice of a kernel in multiple kernel learning. We propose a new algorithm which first consistently estimates the variance of the noise, based upon the concept of minimal penalty, which was previously introduced in the context of model selection. Plugging our variance estimate into Mallows' $C_L$ penalty is then proved to yield an algorithm satisfying an oracle inequality. Simulation experiments with kernel ridge regression and multiple kernel learning show that the proposed algorithm often improves significantly on existing calibration procedures such as generalized cross-validation.
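The two-step procedure described in the abstract can be sketched in code. The following is a minimal illustration for kernel ridge regression with smoother matrices $A_\lambda = K(K + n\lambda I)^{-1}$: for each candidate constant $C$, the minimal-penalty criterion is minimized, and the noise variance is estimated as the $C$ at which the effective dimension $\operatorname{tr}(A_\lambda)$ of the selected estimator jumps; that estimate is then plugged into Mallows' $C_L$ penalty $2\hat\sigma^2 \operatorname{tr}(A_\lambda)$. The Gaussian kernel with unit bandwidth, the grids, and the jump-detection rule are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def minimal_penalty_krr(X, y, lambdas, C_grid):
    """Estimate the noise variance via the minimal-penalty jump,
    then select lambda with Mallows' C_L (sketch, not the paper's exact algorithm)."""
    n = len(y)
    # Gaussian kernel with unit bandwidth (illustrative choice)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2)
    # Smoother matrices A_lambda = K (K + n lambda I)^{-1}
    As = [K @ np.linalg.inv(K + n * lam * np.eye(n)) for lam in lambdas]
    resid = np.array([((y - A @ y) ** 2).sum() for A in As])
    trA = np.array([np.trace(A) for A in As])
    trAA = np.array([np.trace(A.T @ A) for A in As])
    # Minimal-penalty shape: 2 tr(A) - tr(A^T A)
    min_pen = 2.0 * trA - trAA
    # Effective dimension of the estimator selected at each candidate C
    df = np.array([trA[np.argmin(resid + C * min_pen)] for C in C_grid])
    # Variance estimate: C at the largest drop of the effective dimension
    jump = np.argmax(df[:-1] - df[1:])
    sigma2_hat = C_grid[jump + 1]
    # Plug-in Mallows' C_L penalty: 2 sigma^2 tr(A)
    lam_hat = lambdas[np.argmin(resid + 2.0 * sigma2_hat * trA)]
    return sigma2_hat, lam_hat
```

The key heuristic is that for $C < \sigma^2$ the minimal penalty under-penalizes and the selected effective dimension stays large, while for $C > \sigma^2$ it collapses, so the jump location estimates $\sigma^2$.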
Document type : Conference papers
NIPS 2009 - Advances in Neural Information Processing Systems, Dec 2009, Vancouver, Canada. 22, pp.46--54, 2009
https://hal.archives-ouvertes.fr/hal-00414774
Contributor : Sylvain Arlot
Submitted on : Monday, September 12, 2011 - 4:57:09 PM
Last modification on : Thursday, September 29, 2016 - 1:22:16 AM
Archived on : Tuesday, December 13, 2011 - 2:20:32 AM

### Files

minikernel_journal.pdf
Files produced by the author(s)

### Identifiers

• HAL Id : hal-00414774, version 2
• ARXIV : 0909.1884

### Citation

Sylvain Arlot, Francis Bach. Data-driven calibration of linear estimators with minimal penalties. NIPS 2009 - Advances in Neural Information Processing Systems, Dec 2009, Vancouver, Canada. 22, pp.46--54, 2009. <hal-00414774v2>
