k-NN Boosting Prototype Learning for Object Classification - Archive ouverte HAL
Conference paper, Year: 2010

k-NN Boosting Prototype Learning for Object Classification

Abstract

Object classification is a challenging task in computer vision. Many approaches have been proposed to extract meaningful descriptors from images and classify them in a supervised learning framework. In this paper, we revisit the classic k-nearest neighbors (k-NN) classification rule, which has been shown to be very effective when dealing with local image descriptors. However, k-NN still suffers from major drawbacks, mainly due to the uniform voting among the nearest prototypes in the feature space. We propose a generalization of the classic k-NN rule in a supervised learning (boosting) framework. Namely, we redefine the voting rule as a strong classifier that linearly combines predictions from the k closest prototypes. To induce this classifier, we propose a novel learning algorithm, MLNN (Multiclass Leveraged Nearest Neighbors), which provides a simple and efficient procedure for prototype selection. We tested our method on 12 object categories and observed a significant improvement over classic k-NN in terms of classification performance.
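To make the idea of a leveraged vote concrete, here is a minimal sketch, in Python, of a weighted k-NN decision rule in which each prototype carries a learned coefficient instead of contributing a uniform vote. The function name leveraged_knn_predict, the use of Euclidean distance, the per-prototype coefficients alphas, and the integer class encoding are illustrative assumptions; they are not the paper's exact MLNN formulation or its boosting-based training procedure.

# Minimal sketch of a leveraged k-NN vote (illustrative only; the exact MLNN
# update rules and class encoding are assumptions, not taken from the paper).
import numpy as np

def leveraged_knn_predict(x, prototypes, labels, alphas, k, n_classes):
    """Classify x by a weighted vote of its k nearest prototypes.

    prototypes : (n, d) array of prototype feature vectors
    labels     : (n,) array of integer class labels of the prototypes
    alphas     : (n,) array of leveraging coefficients (learned by boosting)
    """
    # Euclidean distances from x to every prototype
    dists = np.linalg.norm(prototypes - x, axis=1)
    nearest = np.argsort(dists)[:k]   # indices of the k closest prototypes

    # Accumulate leveraged votes per class instead of uniform counts
    scores = np.zeros(n_classes)
    for j in nearest:
        scores[labels[j]] += alphas[j]
    return int(np.argmax(scores))

# Usage: with uniform alphas this reduces to the classic k-NN majority vote.
rng = np.random.default_rng(0)
protos = rng.normal(size=(100, 16))
labs = rng.integers(0, 12, size=100)
alphas = np.ones(100)                 # placeholder coefficients
x_test = rng.normal(size=16)
print(leveraged_knn_predict(x_test, protos, labs, alphas, k=5, n_classes=12))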
Main file: pbnn_wiamis_10.pdf (483.32 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-00481725, version 1 (07-05-2010)

Identifiers

  • HAL Id: hal-00481725, version 1

Cite

Paolo Piro, Michel Barlaud, Richard Nock, Frank Nielsen. k-NN Boosting Prototype Learning for Object Classification. WIAMIS 2010 - 11th Workshop on Image Analysis for Multimedia Interactive Services, Apr 2010, Desenzano del Garda, Italy. pp.1-4. ⟨hal-00481725⟩
180 views
113 downloads
