Preprints, Working Papers, ...

Bandwidth selection in kernel empirical risk minimization via the gradient

Abstract: In this paper, we deal with the data-driven selection of multidimensional and (possibly) anisotropic bandwidths in the general problem of kernel empirical risk minimization. We propose a universal selection rule, which leads to optimal adaptive results in a large variety of statistical models such as nonparametric regression or statistical learning with errors-in-variables. These results are stated in the context of smooth loss functions, where the gradient of the risk appears as a good criterion for measuring the performance of our estimators. This turns out to be helpful for deriving excess risk bounds, with fast rates of convergence, in noisy clustering, as well as adaptive minimax results for pointwise and global estimation in robust nonparametric regression. The selection rule is based on a comparison of gradient empirical risks. It can be viewed as a non-trivial extension of the so-called GL method (Goldenshluger and Lepski, 2011) to non-linear estimators. Another key advantage of our selection rule is that it does not depend on the smallest eigenvalue of the Hessian matrix of the risk, an unknown and model-dependent parameter.
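To give a rough sense of how a Goldenshluger-Lepski-type comparison over a bandwidth grid can be organized, the following Python sketch implements a generic Lepski-style selection by pairwise comparison of (gradient) empirical risks. It is a minimal illustration under stated assumptions, not the paper's exact criterion: the callables grad_risk and majorant are hypothetical placeholders for the comparison statistic and the penalty term, and the toy grid at the end is purely illustrative.

def select_bandwidth(bandwidths, grad_risk, majorant):
    """Schematic Lepski/GL-type selection over a grid of candidate bandwidths.

    bandwidths : iterable of candidate bandwidths (scalars, or tuples for
                 anisotropic grids)
    grad_risk  : callable (h, h2) -> float; hypothetical proxy comparing the
                 (gradient) empirical risks built from the two bandwidths
    majorant   : callable h -> float; hypothetical penalty/majorant term
    """
    bandwidths = list(bandwidths)
    best_h, best_crit = None, float("inf")
    for h in bandwidths:
        # Lepski-type bias proxy: worst penalized discrepancy between h and
        # every other candidate on the grid.
        bias_proxy = max(
            (grad_risk(h, h2) - majorant(h2) for h2 in bandwidths),
            default=0.0,
        )
        # Keep only the positive part and add the penalty for h itself.
        crit = max(bias_proxy, 0.0) + majorant(h)
        if crit < best_crit:
            best_h, best_crit = h, crit
    return best_h

# Toy usage with placeholder callables (purely illustrative):
grid = [0.05, 0.1, 0.2, 0.4]
h_hat = select_bandwidth(
    grid,
    grad_risk=lambda h, h2: 0.1 * abs(h - h2),  # stand-in discrepancy
    majorant=lambda h: 0.01 / h,                # stand-in variance term
)
print(h_hat)

The design choice mirrored here is the one emphasized in the abstract: the criterion compares gradient empirical risks directly, so the rule can be stated without knowledge of the smallest eigenvalue of the Hessian matrix of the risk.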

Cited literature: 51 references

https://hal.archives-ouvertes.fr/hal-00937026
Contributor: Sébastien Loustau
Submitted on: Thursday, January 30, 2014 - 3:58:55 PM
Last modification on: Monday, March 9, 2020 - 6:15:53 PM
Document(s) archived on: Sunday, April 9, 2017 - 12:12:16 AM

File

glerc.pdf
Publisher files allowed on an open archive

Identifiers

  • HAL Id: hal-00937026, version 1

Citation

Michaël Chichignoud, Sébastien Loustau. Bandwidth selection in kernel empirical risk minimization via the gradient. 2014. ⟨hal-00937026⟩

Metrics

Record views: 722
File downloads: 282