Fast learning rates for plug-in classifiers under the margin condition

Abstract: It has recently been shown that, under the margin (or low-noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, i.e., rates faster than $n^{-1/2}$. Work on this subject has suggested the following two conjectures: (i) the best achievable fast rate is of order $n^{-1}$, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that both conjectures are false. In particular, we construct plug-in classifiers that achieve not only fast, but also {\it super-fast} rates, i.e., rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
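To make the term "plug-in classifier" concrete: such a rule first estimates the regression function $\eta(x) = P(Y=1 \mid X=x)$ nonparametrically and then plugs the estimate into the Bayes rule $\mathbf{1}\{\eta(x) \ge 1/2\}$. The sketch below is a minimal illustration using a k-nearest-neighbor estimate of $\eta$; the estimator choice and all names are illustrative, not taken from the paper.

```python
import numpy as np

def plug_in_classifier(X_train, y_train, X_test, k=5):
    """Plug-in rule: estimate eta(x) = P(Y=1 | X=x) by a
    k-nearest-neighbor average, then threshold at 1/2,
    mimicking the Bayes classifier 1{eta(x) >= 1/2}."""
    preds = []
    for x in X_test:
        # squared Euclidean distances from x to every training point
        d = np.sum((X_train - x) ** 2, axis=1)
        nn = np.argsort(d)[:k]           # indices of the k nearest neighbors
        eta_hat = y_train[nn].mean()     # local estimate of eta(x)
        preds.append(1 if eta_hat >= 0.5 else 0)
    return np.array(preds)
```

Under the margin condition, $\eta$ stays away from the critical level $1/2$ with high probability, which is what allows such thresholded regression estimates to attain the fast (and super-fast) rates discussed in the abstract.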
Document type: Preprint / working paper
36 pages. 2011
Contributor: Jean-Yves Audibert
Submitted on: Tuesday, May 24, 2011 - 10:31:16
Last modified on: Monday, May 29, 2017 - 14:23:51
Document(s) archived on: Thursday, August 25, 2011 - 02:21:26


Jean-Yves Audibert, Alexandre Tsybakov. Fast learning rates for plug-in classifiers under the margin condition. 36 pages. 2011. 〈hal-00005882v3〉


