
Fast rates for noisy clustering

Abstract: The effect of errors-in-variables in empirical minimization is investigated. Given a loss $l$ and a set of decision rules $\mathcal{G}$, we prove a general upper bound for empirical minimization based on a deconvolution kernel and a noisy sample $Z_i=X_i+\epsilon_i,i=1,\ldots,n$. We apply this general upper bound to derive the rate of convergence of the expected excess risk in noisy clustering. A recent bound from \citet{levrard} shows that this rate is $\mathcal{O}(1/n)$ in the direct case, under Pollard's regularity assumptions. Here the effect of noisy measurements gives a rate of the form $\mathcal{O}(1/n^{\frac{\gamma}{\gamma+2\beta}})$, where $\gamma$ is the Hölder regularity of the density of $X$ and $\beta$ is the degree of ill-posedness.
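The setting above can be illustrated with a minimal simulation. The sketch below is NOT the paper's deconvolution-based estimator; it only reproduces the errors-in-variables setup $Z_i = X_i + \epsilon_i$, fits a $k$-point codebook on the noisy sample with plain Lloyd's algorithm, and compares its clustering distortion on the clean data against an oracle codebook fit directly on $X$. All distributions and parameters are illustrative assumptions.

```python
import numpy as np

# A minimal sketch of the noisy-clustering setting, NOT the paper's
# deconvolution-based estimator: we only observe Z_i = X_i + eps_i,
# fit a k-point codebook on the noisy sample, and evaluate its
# distortion (k-means risk) on the clean sample X.

def distortion(points, centers):
    """Empirical distortion: mean squared distance to the nearest center."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).mean()

def kmeans(points, k, n_restarts=20, n_iters=50, seed=0):
    """Lloyd's algorithm with random restarts; returns the best codebook."""
    rng = np.random.default_rng(seed)
    best, best_risk = None, np.inf
    for _ in range(n_restarts):
        centers = points[rng.choice(len(points), size=k, replace=False)]
        for _ in range(n_iters):
            d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = d2.argmin(axis=1)
            for j in range(k):
                mask = labels == j
                if mask.any():
                    centers[j] = points[mask].mean(axis=0)
        risk = distortion(points, centers)
        if risk < best_risk:
            best, best_risk = centers, risk
    return best

rng = np.random.default_rng(0)
n, k = 2000, 3
means = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
X = means[rng.integers(k, size=n)] + rng.normal(scale=0.5, size=(n, 2))
Z = X + rng.normal(scale=1.0, size=(n, 2))  # noisy observations

c_clean = kmeans(X, k)   # oracle: fit on the (unobserved) direct sample
c_noisy = kmeans(Z, k)   # what we can actually compute from Z

print("risk of clean-fit codebook:", distortion(X, c_clean))
print("risk of noisy-fit codebook:", distortion(X, c_noisy))
```

The paper's point is quantitative: the excess risk of estimators built from $Z$ degrades from the direct-case rate $\mathcal{O}(1/n)$ to $\mathcal{O}(1/n^{\gamma/(\gamma+2\beta)})$; the deconvolution-kernel construction it analyzes corrects for the noise density rather than clustering $Z$ naively as above.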
Contributor: Sébastien Loustau
Submitted on: Monday, May 7, 2012 - 4:05:51 PM
Last modification on: Wednesday, October 20, 2021 - 3:18:53 AM
Long-term archiving on: Wednesday, August 8, 2012 - 2:35:33 AM
  • HAL Id: hal-00695258, version 1
  • arXiv: 1205.1417



Sébastien Loustau. Fast rates for noisy clustering. 2012. ⟨hal-00695258⟩
