From Bayesian Inference to MCMC and Convex Optimisation in Hadamard Manifolds
Abstract
The present work is motivated by the problem of Bayesian inference for Gaussian distributions on symmetric Hadamard spaces (that is, Hadamard manifolds which are also symmetric spaces). To investigate this problem, it introduces new tools for Markov Chain Monte Carlo and for convex optimisation: (1) it provides easy-to-verify sufficient conditions for the geometric ergodicity of an isotropic Metropolis-Hastings Markov chain on a symmetric Hadamard space; (2) it shows that the Riemannian gradient descent method achieves an exponential rate of convergence when applied to a strongly convex function on a Hadamard manifold. Using these tools, two Bayesian estimators, the maximum-a-posteriori and the minimum-mean-squares estimator, are compared. When the underlying Hadamard manifold is a space of constant negative curvature, they turn out to be surprisingly close to each other. This leads to an open problem: are these two estimators in fact equal, assuming constant negative curvature?
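To make the second tool concrete, the following is a minimal numerical sketch (not the thesis's own code) of Riemannian gradient descent on the hyperboloid model of the hyperbolic plane, a Hadamard manifold of constant curvature −1. The objective is the mean squared distance to a set of sample points, a strongly convex function on a Hadamard manifold whose minimiser is the Riemannian barycenter — the quantity underlying a minimum-mean-squares estimator. All function names, the step size, and the sample points are illustrative assumptions.

```python
import numpy as np

def minkowski(u, v):
    # Lorentz inner product on R^{1,2}; the hyperboloid is <x, x> = -1, x[0] > 0
    return -u[0] * v[0] + u[1] * v[1] + u[2] * v[2]

def dist(x, y):
    # Geodesic distance on the hyperboloid model of H^2 (curvature -1)
    return np.arccosh(max(1.0, -minkowski(x, y)))

def log_map(x, y):
    # Riemannian logarithm: the tangent vector at x pointing to y, of length d(x, y)
    d = dist(x, y)
    if d < 1e-12:
        return np.zeros(3)
    u = y + minkowski(x, y) * x  # component of y tangent to the hyperboloid at x
    return d * u / np.sqrt(minkowski(u, u))

def exp_map(x, v):
    # Riemannian exponential: follow the geodesic from x with initial velocity v
    n = np.sqrt(max(0.0, minkowski(v, v)))
    if n < 1e-12:
        return x
    return np.cosh(n) * x + np.sinh(n) * v / n

def frechet_mean(points, step=0.5, iters=100):
    # Riemannian gradient descent on f(x) = (1/2N) * sum_i d(x, y_i)^2.
    # On a Hadamard manifold f is strongly convex, so the iterates converge
    # at an exponential rate; grad f(x) = -(1/N) * sum_i log_x(y_i).
    x = points[0].copy()
    for _ in range(iters):
        grad = -np.mean([log_map(x, y) for y in points], axis=0)
        x = exp_map(x, -step * grad)
    return x

# Example: barycenter of three points obtained by lifting tangent
# vectors at the base point (1, 0, 0) onto the hyperboloid.
base = np.array([1.0, 0.0, 0.0])
pts = [exp_map(base, np.array([0.0, a, b]))
       for a, b in [(0.3, 0.1), (-0.2, 0.4), (0.1, -0.3)]]
mean = frechet_mean(pts)
```

After a few dozen iterations the Riemannian gradient at the iterate is numerically zero, illustrating the exponential convergence rate stated above; the same exponential and logarithm maps are the basic ingredients of an isotropic Metropolis-Hastings proposal on this space.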