Statistical learning with indirect observations

Abstract: Let $(X,Y)\in\mathcal{X}\times \mathcal{Y}$ be a random couple with unknown distribution $P$, let $\mathcal{G}$ be a class of measurable functions, and let $\ell$ be a loss function. The problem of statistical learning deals with the estimation of the Bayes rule: $$g^*=\arg\min_{g\in\mathcal{G}}\mathbb{E}_P\, \ell(g(X),Y).$$ In this paper, we study this problem when only a contaminated sample $(Z_1,Y_1),\ldots,(Z_n,Y_n)$ of i.i.d. indirect observations is available. Each input $Z_i$, $i=1,\ldots,n$, has density $Af$, where $A$ is a known compact linear operator and $f$ is the density of the direct input $X$. We derive fast rates of convergence for empirical risk minimizers based on regularization methods, such as deconvolution kernel density estimators or spectral cut-off. These results are comparable to the existing fast rates of \cite{kolt} for the direct case, and they give some insight into the effect of indirect measurements on fast rates of convergence.
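As a concrete instance of the regularization step mentioned in the abstract, consider the standard convolution case, where the operator $A$ corresponds to additive noise $Z_i = X_i + \epsilon_i$ with known noise density $\eta$, so that $Af = f\ast\eta$. A deconvolution kernel density estimator then takes the classical Fourier form below (a minimal sketch of the usual construction, with a generic kernel $K$ and bandwidth $h$ that are not necessarily the paper's choices): $$\tilde{K}_h(u)=\frac{1}{2\pi}\int e^{-itu}\,\frac{\mathcal{F}[K](t)}{\mathcal{F}[\eta](t/h)}\,dt, \qquad \hat{f}_h(x)=\frac{1}{nh}\sum_{i=1}^{n}\tilde{K}_h\!\left(\frac{x-Z_i}{h}\right),$$ where $\mathcal{F}$ denotes the Fourier transform. Roughly speaking, an empirical risk built from such a regularized estimator of the law of the direct input replaces the unavailable risk $\frac{1}{n}\sum_{i=1}^{n}\ell(g(X_i),Y_i)$, and the fast rates above concern minimizers of that surrogate.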
Complete metadata

https://hal.archives-ouvertes.fr/hal-00664125
Contributor : Sébastien Loustau
Submitted on : Tuesday, July 10, 2012 - 10:14:13 AM
Last modification on : Wednesday, December 19, 2018 - 2:08:04 PM
Document(s) archived on : Thursday, October 11, 2012 - 2:30:08 AM

Files

noisystatlearn.pdf
Publisher files allowed on an open archive

Identifiers

  • HAL Id : hal-00664125, version 3
  • arXiv : 1201.6115


Citation

Sébastien Loustau. Statistical learning with indirect observations. 2012. 〈hal-00664125v3〉


Metrics

Record views : 287
Files downloads : 80