Manifold based local classifiers: linear and nonlinear approaches

Hakan Cevikalp (1), Diane Larlus (2), Mike Neamtu (3), William Triggs (4), Frédéric Jurie (2, 5)

2. LEAR - Learning and recognition in vision (Inria Grenoble - Rhône-Alpes; LJK - Laboratoire Jean Kuntzmann; INPG - Institut National Polytechnique de Grenoble)
4. AI - Artificial Intelligence (LJK - Laboratoire Jean Kuntzmann)
5. Equipe Image - Laboratoire GREYC, UMR 6072 (GREYC - Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen)
Abstract: When data samples are scarce in high-dimensional classification problems, the sparse scatter of samples tends to have many 'holes': regions with few or no nearby training samples from the class. When such regions lie close to inter-class boundaries, the nearest neighbors of a query may lie in the wrong class, leading to errors in the Nearest Neighbor classification rule. The K-local hyperplane distance nearest neighbor (HKNN) algorithm tackles this problem by approximating each class with a smooth nonlinear manifold that is treated as locally linear. The method exploits this local linearity by using the distances from a query sample to the affine hulls of the query's nearest neighbors for decision making. However, HKNN is limited to the Euclidean distance metric, which is a significant limitation in practice. In this paper we reformulate HKNN in terms of subspaces and propose a variant, the Local Discriminative Common Vector (LDCV) method, that is better suited to classification tasks where the classes have similar intra-class variations. We then extend both methods to the nonlinear case by mapping the nearest neighbors into a higher-dimensional space where the linear manifolds are constructed. This allows a wide variety of distance functions to be used, while computing distances between the query sample and the nonlinear manifolds remains straightforward owing to the linear nature of the manifolds in the mapped space. We tested the proposed methods on several classification tasks, obtaining better results than both Support Vector Machines (SVMs) and their local counterpart SVM-KNN on the USPS and Image Segmentation databases, and outperforming the local SVM-KNN on the Caltech visual recognition database.
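For concreteness, below is a minimal numpy sketch of the local affine-hull distance that the abstract describes, for the linear (Euclidean) case only. The function names affine_hull_distance and hknn_predict are illustrative, not from the paper, and the sketch omits any regularization used by the full HKNN algorithm.

import numpy as np

def affine_hull_distance(x, neighbors):
    # neighbors: (k, d) array of the query's k nearest training samples from one class.
    # Their affine hull is {sum_i a_i v_i : sum_i a_i = 1}; centring on the mean turns
    # the constrained fit into an ordinary least-squares projection.
    mean = neighbors.mean(axis=0)
    M = (neighbors - mean).T                        # d x k basis of the hull's directions
    coeffs, *_ = np.linalg.lstsq(M, x - mean, rcond=None)
    return np.linalg.norm((x - mean) - M @ coeffs)  # residual norm = distance to the hull

def hknn_predict(x, X_train, y_train, k=5):
    # Assign x to the class whose local affine hull, built from the k nearest
    # training samples of that class, is closest to x in Euclidean distance.
    best_label, best_dist = None, np.inf
    for label in np.unique(y_train):
        Xc = X_train[y_train == label]
        nearest = np.argsort(np.linalg.norm(Xc - x, axis=1))[:k]
        d = affine_hull_distance(x, Xc[nearest])
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

The LDCV variant and the nonlinear extensions mentioned in the abstract build the same kind of local models after mapping the neighbors into a higher-dimensional feature space; only the linear, Euclidean case is sketched here.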
Document type:
Journal article
Journal of Signal Processing Systems, Springer, 2010, 61 (1), pp. 61-73. DOI: 10.1007/s11265-008-0313-4

https://hal.archives-ouvertes.fr/hal-00565007
Contributor: William Triggs
Submitted on: Thursday, 10 February 2011 - 16:59:25
Last modified on: Wednesday, 27 May 2015 - 10:50:48
Document(s) archived on: Wednesday, 11 May 2011 - 03:20:22

File: Cevikalp-jsps10.pdf
Files produced by the author(s)

Identifiers
HAL Id: hal-00565007
DOI: 10.1007/s11265-008-0313-4

Citation

Hakan Cevikalp, Diane Larlus, Mike Neamtu, William Triggs, Frédéric Jurie. Manifold based local classifiers: linear and nonlinear approaches. Journal of Signal Processing Systems, Springer, 2010, 61 (1), pp. 61-73. DOI: 10.1007/s11265-008-0313-4. HAL Id: hal-00565007.

Metrics
Record views: 490
Document downloads: 310