Boosting Nearest Neighbors for the Efficient Estimation of Posteriors

Abstract: It is an admitted fact that mainstream boosting algorithms like AdaBoost do not perform well at estimating class conditional probabilities. In this paper, we analyze, in the light of this problem, a recent algorithm, UNN, which leverages nearest neighbors while minimizing a convex loss. Our contribution is threefold. First, we show that there exists a subclass of surrogate losses, elsewhere called balanced, whose minimization brings simple and statistically efficient estimators for Bayes posteriors. Second, we show explicit convergence rates towards these estimators for UNN, for any such surrogate loss, under a Weak Learning Assumption which parallels that of classical boosting results. Third and last, we provide experiments and comparisons on synthetic and real datasets, including the challenging SUN computer vision database. Results clearly show that boosting nearest neighbors may provide highly accurate estimators, sometimes more than a hundred times more accurate than those of other contenders like support vector machines.
Document type:
Conference paper
ECML-PKDD 2012, Sep 2012, Bristol, United Kingdom. pp.16, 2012

https://hal.archives-ouvertes.fr/hal-00702771
Contributor: Wafa Bel Haj Ali
Submitted on: Thursday, May 31, 2012 - 11:55:41
Last modified on: Sunday, December 6, 2015 - 01:03:28
Long-term archiving on: Friday, November 30, 2012 - 12:51:17

File

ecml12-dnbnb-sub.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-00702771, version 1

Collections

I3S | UNICE | BNRMI | GAEL | UGA

Citation

Roberto D'Ambrosio, Richard Nock, Wafa Bel Haj Ali, Frank Nielsen, Michel Barlaud. Boosting Nearest Neighbors for the Efficient Estimation of Posteriors. ECML-PKDD 2012, Sep 2012, Bristol, United Kingdom. pp.16, 2012. <hal-00702771>

Metrics

Record views: 533
Document downloads: 94