On the Variational Posterior of Dirichlet Process Deep Latent Gaussian Mixture Models
Preprint, Working Paper. Year: 2020


Amine Echraibi
Joachim Flocon-Cholet
Stéphane Gosselin
Sandrine Vaton

Abstract

Thanks to the reparameterization trick, deep latent Gaussian models have recently shown tremendous success in learning latent representations. Coupling them with nonparametric priors such as the Dirichlet Process (DP), however, has not seen similar success, due to the non-parameterizable nature of such priors. In this paper, we present an alternative treatment of the variational posterior of the Dirichlet Process Deep Latent Gaussian Mixture Model (DP-DLGMM), in which we show that the prior cluster parameters and the variational posteriors of the beta distributions and cluster hidden variables can be updated in closed form. This leads to a standard reparameterization trick on the Gaussian latent variables, conditioned on the cluster assignments. We demonstrate our approach on standard benchmark datasets, showing that our model generates realistic samples for each cluster obtained and achieves competitive performance in a semi-supervised setting.
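Since the other variational factors are updated in closed form, only the Gaussian latent variables require the standard reparameterization trick, conditioned on the cluster assignment. A minimal sketch of that trick (not the authors' implementation; the function name and parameter values below are illustrative): given cluster k, a sample is drawn as z = mu_k + sigma_k * eps with eps ~ N(0, I), so z is a differentiable function of the variational parameters.

```python
import numpy as np

def reparameterize(mu_k, log_var_k, rng):
    # All randomness lives in eps ~ N(0, I); z is then a deterministic,
    # differentiable function of the variational parameters (mu_k, log_var_k),
    # which is what allows gradients to flow through the sampling step.
    eps = rng.standard_normal(mu_k.shape)
    return mu_k + np.exp(0.5 * log_var_k) * eps

rng = np.random.default_rng(0)
mu = np.zeros(4)        # hypothetical variational mean for cluster k
log_var = np.zeros(4)   # log-variance 0 means unit variance
z = reparameterize(mu, log_var, rng)
```

In a full model, `mu_k` and `log_var_k` would be produced by an inference network given the data point and its cluster assignment; here they are fixed placeholders.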
Main file: DirichletProcessDLGMMs.pdf (742.68 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02864385 , version 1 (15-06-2020)
hal-02864385 , version 2 (24-07-2020)

Cite

Amine Echraibi, Joachim Flocon-Cholet, Stéphane Gosselin, Sandrine Vaton. On the Variational Posterior of Dirichlet Process Deep Latent Gaussian Mixture Models. 2020. ⟨hal-02864385v1⟩