Conference Poster, Year: 2019

Efficient Approximate Inference with Walsh-Hadamard Variational Inference

Simone Rossi
  • Role: Author
Sebastien Marmin
  • Role: Author
Maurizio Filippone
  • Role: Author
  • PersonId : 1021042

Abstract

Variational inference offers scalable and flexible tools to tackle intractable Bayesian inference in modern statistical models such as Bayesian neural networks and Gaussian processes. For heavily over-parameterized models, however, the over-regularization induced by the variational objective makes variational inference difficult to apply. Inspired by the literature on kernel methods, and in particular on structured approximations of distributions of random matrices, this paper proposes Walsh-Hadamard Variational Inference (WHVI), which uses Walsh-Hadamard-based factorization strategies to reduce the number of model parameters, accelerate computations, and increase the expressiveness of the approximate posterior beyond fully factorized parameterizations.
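
To make the idea of a Walsh-Hadamard-based factorization concrete, below is a minimal NumPy sketch of one sample of a structured weight matrix of the form W = diag(s1) H diag(g) H diag(s2), where H is the (unnormalized) Walsh-Hadamard matrix and g is drawn from a factorized Gaussian variational posterior via the reparameterization trick. This mirrors the structure the abstract alludes to, but the variable names, the softplus parameterization of the scale, and the normalization convention are illustrative assumptions, not taken from the authors' code. The fast Walsh-Hadamard transform keeps the matrix-vector product at O(D log D) cost with O(D) parameters instead of the D^2 of a dense Gaussian layer.

    import numpy as np

    def fwht(x):
        # Fast Walsh-Hadamard transform (unnormalized) along the last axis.
        # The last-axis length must be a power of two; cost is O(D log D).
        x = np.array(x, dtype=float, copy=True)
        d = x.shape[-1]
        h = 1
        while h < d:
            y = x.reshape(*x.shape[:-1], d // (2 * h), 2, h)
            a = y[..., 0, :] + y[..., 1, :]   # butterfly: sums
            b = y[..., 0, :] - y[..., 1, :]   # butterfly: differences
            x = np.stack([a, b], axis=-2).reshape(*x.shape[:-1], d)
            h *= 2
        return x

    def whvi_matvec(v, s1, s2, g_mu, g_rho, rng):
        # One Monte Carlo sample of W v with W = diag(s1) H diag(g) H diag(s2),
        # where g ~ N(g_mu, diag(softplus(g_rho)^2)) (reparameterization trick).
        # Only 4*D parameters are stored instead of D^2 for a dense layer.
        g_sigma = np.log1p(np.exp(g_rho))      # softplus for a positive scale
        g = g_mu + g_sigma * rng.standard_normal(g_mu.shape)
        out = fwht(s2 * v)                     # H diag(s2) v
        out = fwht(g * out)                    # H diag(g) H diag(s2) v
        return s1 * out                        # diag(s1) H diag(g) H diag(s2) v

    # Toy usage: D must be a power of two.
    rng = np.random.default_rng(0)
    D = 8
    s1, s2 = rng.standard_normal(D), rng.standard_normal(D)
    g_mu, g_rho = np.zeros(D), np.full(D, -3.0)   # small initial variance
    v = rng.standard_normal(D)
    print(whvi_matvec(v, s1, s2, g_mu, g_rho, rng))

Because H couples all coordinates, a factorized Gaussian on g induces a non-factorized distribution over the entries of W, which is how this kind of structure can go beyond fully factorized posteriors while remaining cheap.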

Dates and versions

hal-02987977, version 1 (04-11-2020)

Identifiers

Cite

Simone Rossi, Sebastien Marmin, Maurizio Filippone. Efficient Approximate Inference with Walsh-Hadamard Variational Inference. 4th Workshop on Bayesian Deep Learning (NeurIPS 2019), Dec 2019, Vancouver, Canada. ⟨hal-02987977⟩