On the Variational Posterior of Dirichlet Process Deep Latent Gaussian Mixture Models
Conference paper, 2020


Abstract

Thanks to the reparameterization trick, deep latent Gaussian models have recently shown tremendous success in learning latent representations. However, coupling them with nonparametric priors such as the Dirichlet Process (DP) has not seen similar success, due to the DP's non-parameterizable nature. In this paper, we present an alternative treatment of the variational posterior of the Dirichlet Process Deep Latent Gaussian Mixture Model (DP-DLGMM), where we show that the prior cluster parameters and the variational posteriors of the beta distributions and cluster hidden variables can be updated in closed form. This leads to a standard reparameterization trick on the Gaussian latent variables knowing the cluster assignments. We demonstrate our approach on standard benchmark datasets, showing that our model is capable of generating realistic samples for each cluster obtained and exhibits competitive performance in a semi-supervised setting.
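As a rough illustration (not the authors' implementation; all names and shapes here are assumptions), the two ingredients the abstract combines can be sketched as follows: stick-breaking maps beta samples to DP mixture weights, and, once a cluster assignment k is known, the Gaussian latent is sampled via the standard reparameterization z = mu_k + sigma_k * eps with eps ~ N(0, I), which keeps the sample differentiable with respect to mu_k and log_var_k.

```python
import numpy as np

def stick_breaking(v):
    """Map beta samples v_1..v_K to mixture weights via stick breaking:
    w_k = v_k * prod_{j<k} (1 - v_j). Illustrative sketch only."""
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

def reparameterize(mu, log_var, rng):
    """Sample z = mu + sigma * eps, eps ~ N(0, I).

    eps is drawn independently of the parameters, so gradients can flow
    through mu and log_var (the point of the reparameterization trick).
    """
    eps = rng.standard_normal(np.shape(mu))
    return mu + np.exp(0.5 * log_var) * eps

# Toy usage with hypothetical per-cluster parameters:
rng = np.random.default_rng(0)
weights = stick_breaking(np.array([0.5, 0.5, 1.0]))  # mixture weights
mus = np.array([[0.0, 0.0], [5.0, 5.0], [-5.0, 5.0]])
log_vars = np.zeros((3, 2))
k = 1                                # a sampled cluster assignment
z = reparameterize(mus[k], log_vars[k], rng)
```

In the DP-DLGMM setting described above, the beta posteriors and cluster assignments are updated in closed form, so the gradient-based part of training only needs this reparameterized Gaussian sample conditioned on k.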
Main file: DirichletProcessDLGMs.pdf (885.21 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02864385 , version 1 (15-06-2020)
hal-02864385 , version 2 (24-07-2020)

Identifiers

Cite

Amine Echraibi, Joachim Flocon-Cholet, Stéphane Gosselin, Sandrine Vaton. On the Variational Posterior of Dirichlet Process Deep Latent Gaussian Mixture Models. ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models, Jul 2020, Vienna, Austria. ⟨hal-02864385v2⟩
258 views
208 downloads
