Weakly supervised learning from scale invariant feature transform keypoints: an approach combining fast eigendecomposition, regularization, and diffusion on graphs
Abstract
We propose a unified approach to propagating knowledge into a high-dimensional space from a small informative set, in this case scale invariant feature transform (SIFT) features. Our contribution is threefold. First, we propose a spectral graph embedding of the SIFT keypoints for dimensionality reduction, which efficiently maps the keypoints into a Euclidean space. We use iterative deflation to speed up the eigendecomposition of the Laplacian matrix of the embedded graph. Second, we describe a variational framework for manifold denoising based on the p-Laplacian to enhance keypoint classification, lessening the negative impact of outliers on our variational shape framework and achieving higher classification accuracy through agglomerative categorization. Finally, we present our algorithm for multilabel diffusion on graphs. A theoretical analysis of the algorithm is developed, along with its connections to other methods. Tests were conducted on a collection of images from the Berkeley database. Performance evaluation shows that our framework efficiently propagates the prior knowledge.
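To make the pipeline concrete, the following is a minimal sketch of the two graph-based steps the abstract names: a spectral embedding obtained from the smallest eigenvectors of a normalized graph Laplacian, and a multilabel diffusion that propagates a few seed labels over the graph. The Gaussian affinities, the Zhou-style diffusion update `F ← αSF + (1−α)Y`, and all function names and parameters here are illustrative assumptions, not the paper's exact formulation (which additionally uses iterative deflation and a p-Laplacian regularizer).

```python
import numpy as np

def spectral_embedding(X, sigma=1.0, dim=2):
    """Embed feature vectors (stand-ins for SIFT descriptors) via the
    smallest nontrivial eigenvectors of the symmetric normalized Laplacian."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian affinity graph
    np.fill_diagonal(W, 0.0)
    D = W.sum(axis=1)
    # L = I - D^{-1/2} W D^{-1/2}
    L = np.eye(len(X)) - (W / np.sqrt(D)[:, None]) / np.sqrt(D)[None, :]
    vals, vecs = np.linalg.eigh(L)                # eigenvalues in ascending order
    # Skip the trivial (smoothest) eigenvector; keep the next `dim` ones
    return vecs[:, 1:dim + 1]

def diffuse_labels(W, seeds, n_classes, alpha=0.9, iters=100):
    """Multilabel diffusion on a weighted graph: iterate F <- a*S*F + (1-a)*Y,
    where S is the normalized affinity matrix and Y encodes the seed labels."""
    D = W.sum(axis=1)
    S = (W / np.sqrt(D)[:, None]) / np.sqrt(D)[None, :]
    Y = np.zeros((len(W), n_classes))
    for node, cls in seeds.items():               # seeds: {node index: class}
        Y[node, cls] = 1.0
    F = Y.copy()
    for _ in range(iters):
        F = alpha * (S @ F) + (1.0 - alpha) * Y
    return F.argmax(axis=1)                       # predicted class per node
```

A typical use would embed the descriptors, build the affinity graph once, then diffuse a handful of hand-labeled keypoints to the rest; in practice the dense eigendecomposition above would be replaced by a sparse solver (or, as in the paper, deflation) for large keypoint sets.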
Source: files produced by the author(s)