Small variance asymptotics and Bayesian nonparametrics for dictionary learning

Abstract: Bayesian nonparametrics (BNP) is an appealing framework for inferring the complexity of a model along with its parameters. To this aim, sampling or variational methods are often used for inference. However, these methods come with numerical disadvantages for large-scale data: in particular, the cost of Monte-Carlo sampling can be a limitation of BNP approaches. An alternative is to relax the probabilistic model into a non-probabilistic formulation that yields a scalable algorithm. Small-variance asymptotic (SVA) approaches pave the way to much cheaper, though approximate, inference methods by exploiting a fruitful interaction between Bayesian models and optimization algorithms. In brief, SVA lets the variance of the noise (or residual error) distribution tend to zero in the optimization problem corresponding to, for instance, a MAP estimator with finite noise variance. We propose such an SVA analysis of a BNP dictionary learning (DL) approach that automatically and efficiently adapts the size of the dictionary or the subspace dimension. Numerical experiments illustrate the efficiency of the proposed method.
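The SVA principle described in the abstract can be illustrated with a minimal NumPy sketch (a hypothetical toy clustering example, not the authors' BNP dictionary learning algorithm): under an isotropic Gaussian noise model with uniform mixing weights, MAP responsibilities are soft for finite noise variance, and collapse to a hard nearest-center, k-means-style assignment rule as the variance tends to zero.

```python
import numpy as np

def soft_assignments(X, centers, sigma2):
    """MAP responsibilities under an isotropic Gaussian noise model
    with variance sigma2 (uniform mixing weights assumed)."""
    # Squared distance from every point to every candidate center
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    log_r = -d2 / (2.0 * sigma2)
    log_r -= log_r.max(axis=1, keepdims=True)   # numerical stabilization
    r = np.exp(log_r)
    return r / r.sum(axis=1, keepdims=True)

# Toy data: two well-separated clusters and their candidate centers
X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [4.9, 5.0]])
centers = np.array([[0.0, 0.0], [5.0, 5.0]])

# Finite, large variance: assignments remain soft
r_soft = soft_assignments(X, centers, sigma2=50.0)

# Vanishing variance: responsibilities collapse onto the nearest
# center, i.e. the hard k-means assignment rule
hard = soft_assignments(X, centers, sigma2=1e-6).argmax(axis=1)
```

In the paper's setting the same limit is taken in the MAP objective of a BNP dictionary learning model, so that the resulting optimization problem adapts the dictionary size at a much lower computational cost than Monte-Carlo sampling.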
Contributor: Hong-Phuong Dang
Submitted on: Thursday, December 20, 2018 - 11:25:45 AM
Last modified on: Tuesday, April 2, 2019 - 2:27:15 AM
Document(s) archived on: Friday, March 22, 2019 - 12:22:41 PM





Clément Elvira, Hong-Phuong Dang, Pierre Chainais. Small variance asymptotics and bayesian nonparametrics for dictionary learning. EUSIPCO 2018 - 26th European Signal Processing Conference, Sep 2018, Rome, Italy. pp.1607-1611, ⟨10.23919/EUSIPCO.2018.8553142⟩. ⟨hal-01961852⟩


