Conference paper, 2018

Small variance asymptotics and Bayesian nonparametrics for dictionary learning

Abstract

Bayesian nonparametrics (BNP) is an appealing framework for inferring the complexity of a model along with its parameters. To this aim, sampling or variational methods are often used for inference, but these methods come with numerical disadvantages for large-scale data: a key limitation of BNP approaches is the cost of Monte Carlo sampling. An alternative is to relax the probabilistic model into a non-probabilistic formulation that yields a scalable algorithm. Small-variance asymptotic (SVA) approaches pave the way to much cheaper, though approximate, inference methods by exploiting a fruitful interaction between Bayesian models and optimization algorithms. In brief, SVA lets the variance of the noise (or residual error) distribution tend to zero in the optimization problem corresponding, for instance, to a MAP estimator with finite noise variance. We propose such an SVA analysis of a BNP dictionary learning (DL) approach that automatically and efficiently adapts the size of the dictionary or the subspace dimension. Numerical experiments illustrate the efficiency of the proposed method.
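To give intuition for the SVA mechanism sketched above, here is a minimal, hypothetical toy in Python. It is not the paper's algorithm: each signal is approximated by its projection onto a single unit-norm atom, and a penalty `lam` (the term a BNP prior typically degenerates into as the noise variance vanishes) is charged for opening a new atom, so the dictionary size adapts to the data. The function name, the penalty rule, and the rank-one SVD update are all illustrative assumptions.

```python
import numpy as np

def sva_dictionary_learning(X, lam=1.0, n_iter=20):
    """Toy SVA-flavoured dictionary learning (illustrative sketch only).

    Each signal (a column of X) is approximated by its projection onto
    one unit-norm atom.  If every existing atom leaves a squared
    residual larger than the penalty `lam`, a new atom is opened --
    mimicking how small-variance asymptotics turn a BNP prior into a
    per-atom penalty, so the dictionary size is inferred from the data.
    """
    n, m = X.shape
    # start with one atom: the normalised first signal
    D = [X[:, 0] / np.linalg.norm(X[:, 0])]
    assign = np.zeros(m, dtype=int)
    for _ in range(n_iter):
        # assignment step: pick the cheapest atom, or open a new one
        for j in range(m):
            x = X[:, j]
            costs = [np.linalg.norm(x - (d @ x) * d) ** 2 for d in D]
            k = int(np.argmin(costs))
            if costs[k] > lam:
                D.append(x / np.linalg.norm(x))
                assign[j] = len(D) - 1
            else:
                assign[j] = k
        # update step: each atom becomes the leading left singular
        # vector of the signals currently assigned to it
        for k in range(len(D)):
            S = X[:, assign == k]
            if S.shape[1] > 0:
                D[k] = np.linalg.svd(S, full_matrices=False)[0][:, 0]
    return np.stack(D, axis=1), assign
```

On data drawn from two orthogonal directions, the loop opens exactly two atoms for a small enough `lam`; increasing `lam` merges directions into fewer atoms, which is the size-adaptation behaviour the abstract describes.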
Main file: 2018_EUSIPCO.pdf (413.98 KB)
Origin: files produced by the author(s)

Dates and versions

hal-01961852, version 1 (20-12-2018)

Identifiers

Cite

Clément Elvira, Hong-Phuong Dang, Pierre Chainais. Small variance asymptotics and bayesian nonparametrics for dictionary learning. EUSIPCO 2018 - 26th European Signal Processing Conference, Sep 2018, Rome, Italy. pp.1607-1611, ⟨10.23919/EUSIPCO.2018.8553142⟩. ⟨hal-01961852⟩
213 views
157 downloads

