Conference papers

On the Variational Posterior of Dirichlet Process Deep Latent Gaussian Mixture Models

Amine Echraibi (1, 2, 3), Joachim Flocon-Cholet (2), Stéphane Gosselin (2), Sandrine Vaton (1, 3)
3. Lab-STICC_MATHNET - Math & Net Team, Lab-STICC - Laboratoire des sciences et techniques de l'information, de la communication et de la connaissance (UMR 6285)
Abstract: Thanks to the reparameterization trick, deep latent Gaussian models have recently shown tremendous success in learning latent representations. However, coupling them with nonparametric priors such as the Dirichlet Process (DP) has not seen similar success, due to the DP's non-parameterizable nature. In this paper, we present an alternative treatment of the variational posterior of the Dirichlet Process Deep Latent Gaussian Mixture Model (DP-DLGMM), in which we show that the prior cluster parameters and the variational posteriors of the beta distributions and the cluster hidden variables can be updated in closed form. Inference then reduces to a standard reparameterization trick on the Gaussian latent variables given the cluster assignments. We demonstrate our approach on standard benchmark datasets, show that our model generates realistic samples for each obtained cluster, and report competitive performance in a semi-supervised setting.
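The sampling scheme the abstract describes — stick-breaking weights from beta variables, a cluster assignment, then a reparameterized Gaussian latent conditioned on that cluster — can be sketched as follows. This is a minimal illustration, not the authors' implementation; all parameter values (`K`, `d`, the beta hyperparameters, and the per-cluster variational means and log-variances) are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var, rng):
    """Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def stick_breaking_weights(v):
    """Map stick-breaking beta samples v_k to mixture weights pi_k."""
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

# Hypothetical variational parameters for a truncation level of K clusters
K, d = 4, 2
mu = rng.standard_normal((K, d))          # per-cluster means of q(z | c = k)
log_var = 0.1 * rng.standard_normal((K, d))

v = rng.beta(1.0, 3.0, size=K)            # samples of the stick-breaking betas
pi = stick_breaking_weights(v)            # mixture weights

k = rng.choice(K, p=pi / pi.sum())        # sample a cluster assignment
z = reparameterize(mu[k], log_var[k], rng)  # reparameterized Gaussian latent
```

Because `z` is an explicit differentiable function of `mu[k]` and `log_var[k]`, gradients can flow through the Gaussian sample once the cluster assignment is known, which is the point the paper exploits.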
Contributor: Amine Echraibi
Submitted on : Friday, July 24, 2020 - 10:36:50 AM
Last modification on : Wednesday, November 3, 2021 - 6:14:58 AM
  • HAL Id : hal-02864385, version 2
  • ARXIV : 2006.08993


Amine Echraibi, Joachim Flocon-Cholet, Stéphane Gosselin, Sandrine Vaton. On the Variational Posterior of Dirichlet Process Deep Latent Gaussian Mixture Models. ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models, Jul 2020, Vienna, Austria. ⟨hal-02864385v2⟩


