Fast Newton Nearest Neighbors Boosting For Image Classification

Abstract: Recent work has shown that large-scale image classification problems rule out computationally demanding methods. On such problems, simple approaches like k-NN are affordable contenders, with room left for statistical improvement under the algorithmic constraints. A recent work showed how to leverage k-NN into a formal boosting algorithm. That method, however, has numerical issues that make it unsuited to large-scale problems. We propose an adaptive Newton-Raphson scheme to leverage k-NN, N3, which does not suffer from these issues. We show that it is a boosting algorithm with several key algorithmic and statistical properties; in particular, boosting a subsample may be sufficient to reach the desired bounds on the loss at hand in the boosting framework. Experiments are provided on the SUN and Caltech databases. They confirm that boosting a subsample -- sometimes containing only a few examples -- is sufficient to reach the convergence regime of N3. Under such conditions, N3 challenges the accuracy of contenders at lower computational cost and with lower memory requirements.
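The abstract only sketches the idea of leveraging k-NN with Newton-Raphson updates; the paper itself gives the exact scheme. The following is a minimal illustrative sketch under assumptions not stated in this record: a logistic surrogate loss, a leverage coefficient `alpha[j]` per training example, a query scored by the signed, leveraged votes of its k nearest neighbors, and one Newton-Raphson step per coefficient per pass. The function name `n3_sketch` and all details are hypothetical, not the authors' implementation.

```python
import numpy as np

def knn_indices(X, k):
    # Pairwise Euclidean distances; each row lists the indices of the
    # k nearest neighbors of that example (self excluded).
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def n3_sketch(X, y, k=3, iters=20):
    """Newton-style leveraging of k-NN votes under a logistic loss (sketch).

    alpha[j] is a leverage coefficient for training example j; the score
    of example i is sum of alpha[j] * y[j] over the k nearest neighbors j
    of i. Labels y are in {-1, +1}.
    """
    n = len(y)
    nn = knn_indices(X, k)
    alpha = np.zeros(n)
    for _ in range(iters):
        for j in range(n):
            # Examples i that have j among their k nearest neighbors,
            # i.e. the examples whose score alpha[j] influences.
            voters = np.where((nn == j).any(axis=1))[0]
            if voters.size == 0:
                continue
            scores = np.array([alpha[nn[i]] @ y[nn[i]] for i in voters])
            m = y[voters] * scores            # margins on affected examples
            p = 1.0 / (1.0 + np.exp(m))       # logistic weight exp(-m)/(1+exp(-m))
            # d/d_alpha_j of sum_i log(1 + exp(-m_i)), with dm_i/d_alpha_j = y_i*y_j
            grad = -np.sum(y[voters] * y[j] * p)
            hess = np.sum(p * (1.0 - p)) + 1e-12
            alpha[j] -= grad / hess           # one Newton-Raphson step
    return alpha, nn
```

On two well-separated clusters, a few passes suffice for the leveraged neighbor scores to match the labels in sign, which is the convergence behavior the abstract describes reaching from a subsample.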
Keywords: Machine learning

Cited literature: 19 references

https://hal.archives-ouvertes.fr/hal-00959125
Contributor: Estelle Nivault <>
Submitted on: Friday, March 14, 2014 - 9:05:43 AM
Last modification on: Thursday, February 7, 2019 - 4:15:16 PM
Archived on: Saturday, June 14, 2014 - 10:50:56 AM

File

BNNB_mlsp2013.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-00959125, version 1

Citation

Wafa Bel Haj Ali, Richard Nock, Franck Nielsen, Michel Barlaud. Fast Newton Nearest Neighbors Boosting For Image Classification. MLSP - 23rd Workshop on Machine Learning for Signal Processing, Sep 2013, Southampton, United Kingdom. pp.6. ⟨hal-00959125⟩

Metrics

Record views: 2073
File downloads: 245