Conference paper, 2012

Dictionary Learning with Large Step Gradient Descent for Sparse Representations

Abstract

This work presents a new algorithm for dictionary learning. Existing algorithms such as MOD and K-SVD often fail to find the best dictionary because they get trapped in a local minimum. Olshausen and Field's Sparsenet algorithm relies on a fixed-step projected gradient descent; with the right step, it can avoid local minima and converge towards the global minimum. The problem then becomes choosing that step size. In this work we derive an expression for the optimal step of the gradient descent, but the step we actually use is twice as large. That large step allows the descent to bypass local minima and yields significantly better results than existing algorithms. The algorithms are compared on synthetic data: our method outperforms existing algorithms both in approximation quality and in perfect recovery rate when an oracle support for the sparse representation is provided.
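The abstract only sketches the method, so the snippet below is a minimal illustrative sketch of one such dictionary update, not the authors' code: a projected gradient step on the squared approximation error ||X - DA||_F^2 where the step is set to twice the exact line-search optimum, followed by renormalisation of the atoms to unit norm. It assumes a full-dictionary update with fixed coefficients A; the paper's derivation may differ in detail (e.g. atom-by-atom updates). The names X, D, A and large_step_dictionary_update are placeholders introduced here.

```python
import numpy as np

def large_step_dictionary_update(X, D, A, eps=1e-12):
    """One projected gradient step on ||X - D A||_F^2 with step = 2 * optimal.

    Illustrative sketch only. X: data (d x n), D: dictionary (d x k),
    A: sparse coefficients (k x n), held fixed during this update.
    """
    R = X - D @ A                        # residual
    G = -2.0 * R @ A.T                   # gradient of the squared error w.r.t. D
    GA = G @ A
    # Exact line-search step along -G for this quadratic objective:
    # f(D - a*G) = ||R + a*G*A||_F^2 is minimised at a* = ||G||_F^2 / (2 ||G A||_F^2).
    alpha_opt = np.sum(G * G) / (2.0 * np.sum(GA * GA) + eps)
    D = D - 2.0 * alpha_opt * G          # the "large step": twice the optimal step
    # Projection onto the constraint set: unit l2 norm for every atom
    norms = np.linalg.norm(D, axis=0)
    return D / np.maximum(norms, eps)
```

In a full learning loop this update would alternate with a sparse coding step (or, as in the paper's oracle experiments, a least-squares fit of A on a known support), iterating until the dictionary stabilises.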
Main file: llncs.pdf (1.64 MB). Origin: files produced by the author(s).

Dates and versions

hal-00688368, version 1 (17-04-2012)

Identifiers

Cite

Boris Mailhé, Mark D. Plumbley. Dictionary Learning with Large Step Gradient Descent for Sparse Representations. LVA/ICA 2012, Mar 2012, Tel-Aviv, Israel. pp.231-238, ⟨10.1007/978-3-642-28551-6_29⟩. ⟨hal-00688368⟩
138 views
483 downloads
