Regression: warped bases and model selection by penalization and Lepski's method

Abstract: This paper deals with the problem of estimating a regression function $f$ in a random design framework. We build and study two adaptive estimators based on model selection, applied with warped bases. We start with a collection of finite-dimensional linear spaces, spanned by orthonormal bases. Instead of directly expanding the target function $f$ on these bases, we rather consider the expansion of $h=f\circ G^{-1}$, where $G$ is the cumulative distribution function of the design, following Kerkyacharian and Picard (2004). The data-driven selection of the (best) space is done with two strategies: we use both a penalized version of a "warped contrast" and a model selection device in the spirit of Goldenshluger and Lepski (2011). These methods yield two functions, $\hat{h}_l$ ($l=1,2$), easier to compute than least-squares estimators. We establish nonasymptotic integrated mean-squared risk bounds for the resulting estimators, $\hat{f}_l=\hat{h}_l\circ G$ if $G$ is known, or $\hat{f}_l=\hat{h}_l\circ\hat{G}$ ($l=1,2$) otherwise, where $\hat{G}$ is the empirical distribution function. We also study adaptive properties, in case the regression function belongs to a Besov or Sobolev space, and compare the theoretical and practical performances of the two selection rules.
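The warping idea in the abstract can be illustrated numerically. The sketch below, a minimal illustration and not the paper's exact procedure, estimates the coefficients of $h=f\circ G^{-1}$ on a cosine basis via the empirical moments $\hat a_j = n^{-1}\sum_i Y_i\varphi_j(\hat G(X_i))$, selects a dimension with a simple penalized contrast (the penalty constant `c` and the maximal dimension `Dmax` are illustrative choices, not values from the paper), and returns $\hat f = \hat h\circ\hat G$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated random-design regression data: Y = f(X) + noise.
n = 2000
X = rng.beta(2.0, 3.0, size=n)            # non-uniform design, CDF G unknown
f = lambda x: np.sin(2 * np.pi * x) + x   # target regression function
Y = f(X) + 0.2 * rng.standard_normal(n)

# Empirical CDF Ghat, the plug-in for the unknown design CDF G.
Xs = np.sort(X)
def Ghat(t):
    return np.searchsorted(Xs, t, side="right") / n

# Orthonormal cosine basis of L^2([0,1]): phi_0 = 1, phi_j(u) = sqrt(2) cos(pi j u).
def phi(j, u):
    return np.ones_like(u) if j == 0 else np.sqrt(2.0) * np.cos(np.pi * j * u)

# Warped coefficient estimates: E[Y phi_j(G(X))] = ∫ h(u) phi_j(u) du = a_j,
# with h = f ∘ G^{-1}, so empirical means of Y * phi_j(Ghat(X)) estimate a_j.
U = Ghat(X)
Dmax = 30                                  # illustrative maximal dimension
ahat = np.array([np.mean(Y * phi(j, U)) for j in range(Dmax)])

# Penalized dimension selection: minimize -sum_{j<m} ahat_j^2 + pen(m).
# pen(m) = c * m / n with an ad hoc constant c (the theory calibrates it).
c = 2.0 * np.var(Y)
crit = [-np.sum(ahat[:m] ** 2) + c * m / n for m in range(1, Dmax + 1)]
m_hat = int(np.argmin(crit)) + 1

# Final estimator: fhat(x) = hhat(Ghat(x)), hhat the selected projection.
def fhat(x):
    u = Ghat(np.asarray(x, dtype=float))
    return sum(ahat[j] * phi(j, u) for j in range(m_hat))

grid = np.linspace(0.05, 0.95, 200)
mse = np.mean((fhat(grid) - f(grid)) ** 2)
print("selected dimension:", m_hat, " empirical MSE:", mse)
```

Note that, as in the abstract, no matrix inversion is needed: the coefficients are plain empirical means, which is what makes $\hat h$ easier to compute than a least-squares estimator.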
Document type: Preprints, Working Papers, ...

Cited literature: 27 references

Contributor: Gaëlle Chagny
Submitted on: Wednesday, October 26, 2011 - 4:31:40 PM
Last modification on: Friday, September 20, 2019 - 4:34:02 PM
Long-term archiving on: Friday, January 27, 2012 - 2:34:37 AM


Files produced by the author(s)


  • HAL Id: hal-00519556, version 2



Gaëlle Chagny. Régression: bases déformées et sélection de modèles par pénalisation et méthode de Lepski. 2011. ⟨hal-00519556v2⟩
