Conference paper, Year: 2013

Fast Newton Nearest Neighbors Boosting For Image Classification

Abstract

Recent work shows that large-scale image classification problems rule out computationally demanding methods. On such problems, simple approaches like k-NN are affordable contenders, with room left for statistical improvement under the algorithmic constraints. A recent work showed how to leverage k-NN to yield a formal boosting algorithm. This method, however, has numerical issues that make it unsuited for large-scale problems. We propose here an Adaptive Newton-Raphson scheme to leverage k-NN, N3, which does not suffer from these issues. We show that it is a boosting algorithm, with several key algorithmic and statistical properties. In particular, it may be sufficient to boost a subsample to reach the desired bounds for the loss at hand in the boosting framework. Experiments are provided on the SUN and Caltech databases. They confirm that boosting a subsample -- sometimes containing only a few examples -- is sufficient to reach the convergence regime of N3. Under such conditions, N3 challenges the accuracy of contenders at lower computational cost and lower memory requirement.
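For intuition, here is a minimal, hedged sketch of what "leveraging k-NN with coordinate-wise Newton-Raphson updates" can look like in code. It is an illustrative reconstruction under assumptions (binary labels in {-1, +1}, a logistic surrogate loss, plain Newton steps on per-example leveraging coefficients, scikit-learn's NearestNeighbors, and an optional random subsample of prototypes); it is not the paper's exact N3 algorithm, whose loss, update rule and subsampling strategy are specified in the PDF.

# Toy sketch: leveraging k-NN with coordinate-wise Newton-Raphson updates.
# The loss (logistic), the neighborhood handling and the subsampling scheme
# below are assumptions for illustration, not the paper's exact N3 procedure.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def fit_leveraged_knn(X, y, k=5, n_rounds=20, subsample=None, seed=0):
    """Learn per-example leveraging coefficients alpha; y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    _, idx = NearestNeighbors(n_neighbors=k).fit(X).kneighbors(X)
    alpha = np.zeros(n)        # leveraging coefficient of each training example
    scores = np.zeros(n)       # current ensemble score of each training example
    for _ in range(n_rounds):
        # optionally boost only a random subsample of candidate prototypes
        candidates = (rng.choice(n, size=subsample, replace=False)
                      if subsample is not None else np.arange(n))
        for j in candidates:
            covered = np.where((idx == j).any(axis=1))[0]  # points having j as a neighbor
            if covered.size == 0:
                continue
            w = 1.0 / (1.0 + np.exp(y[covered] * scores[covered]))  # logistic weights
            grad = np.sum(w * y[covered]) * y[j]
            hess = np.sum(w * (1.0 - w)) + 1e-12
            step = grad / hess           # Newton-Raphson step on coordinate alpha_j
            alpha[j] += step
            scores[covered] += step * y[j]
    return alpha, idx


def predict(X_train, y_train, alpha, X_test, k=5):
    """k-NN vote weighted by the learned leveraging coefficients."""
    _, idx = NearestNeighbors(n_neighbors=k).fit(X_train).kneighbors(X_test)
    return np.sign((alpha[idx] * y_train[idx]).sum(axis=1))

Boosting only a random subsample of prototypes per round (the subsample argument above) mirrors, in spirit, the abstract's observation that boosting a subsample can suffice to reach the convergence regime while lowering computational cost.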


Main file

BNNB_mlsp2013.pdf (161.55 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00959125, version 1 (14-03-2014)

Identifiers

  • HAL Id: hal-00959125, version 1

Cite

Wafa Bel Haj Ali, Richard Nock, Franck Nielsen, Michel Barlaud. Fast Newton Nearest Neighbors Boosting For Image Classification. MLSP - 23rd Workshop on Machine Learning for Signal Processing, Sep 2013, Southampton, United Kingdom. pp.6. ⟨hal-00959125⟩
1170 views
230 downloads
