# A Unified View of Stochastic Hamiltonian Sampling

### Abstract

In this work, we revisit the theoretical properties of Hamiltonian stochastic differential equations (SDEs) for Bayesian posterior sampling, and we study the two types of errors that arise from numerical SDE simulation: the discretization error and the error due to noisy gradient estimates in the context of data subsampling. We consider overlooked results describing the ergodic convergence rates of numerical integration schemes, and we produce a novel analysis for the effect of mini-batches through the lens of differential operator splitting. In our analysis, the stochastic component of the proposed Hamiltonian SDE is decoupled from the gradient noise, for which we make no normality assumptions. This allows us to derive interesting connections among different sampling schemes, including the original Hamiltonian Monte Carlo (HMC) algorithm, and explain their performance. We show that for a careful selection of numerical integrators, both errors vanish at a rate $\mathcal{O}(\eta^2)$, where $\eta$ is the integrator step size. Our theoretical results are supported by an empirical study on a variety of regression and classification tasks for Bayesian neural networks.
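The abstract's setting can be illustrated with a minimal sketch (not the paper's exact scheme): a symmetric-splitting integrator for Hamiltonian dynamics with friction, where mini-batch gradient noise is simulated by a synthetic Gaussian perturbation of the exact gradient. The target here is an assumed toy posterior, a standard normal with potential $U(x) = x^2/2$; all function names and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad_U(x, noise_scale=0.1):
    # Exact gradient of U(x) = x^2 / 2 plus a synthetic perturbation
    # standing in for mini-batch gradient noise (no normality is assumed
    # in the paper; Gaussian noise is used here purely for illustration).
    return x + noise_scale * rng.normal()

def hamiltonian_sde_sampler(n_steps=200_000, eta=0.05, gamma=1.0):
    """Symmetric (leapfrog-style) splitting plus an exact
    Ornstein-Uhlenbeck friction/noise step on the momentum."""
    x, p = 0.0, 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        # B: half-step momentum update with the noisy gradient
        p -= 0.5 * eta * noisy_grad_U(x)
        # A: full-step position update
        x += eta * p
        # B: second half-step momentum update (symmetric splitting)
        p -= 0.5 * eta * noisy_grad_U(x)
        # O: exact OU step -- friction plus injected noise keeps the
        # dynamics ergodic with respect to (approximately) the target
        c = np.exp(-gamma * eta)
        p = c * p + np.sqrt(1.0 - c * c) * rng.normal()
        samples[i] = x
    return samples

xs = hamiltonian_sde_sampler()
```

With a small step size $\eta$, the empirical mean and variance of `xs` should be close to the target's $(0, 1)$; the residual bias shrinks with $\eta$, consistent with the $\mathcal{O}(\eta^2)$ rate discussed in the abstract.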
Document type: Preprints, Working Papers, ...

https://hal.archives-ouvertes.fr/hal-03344742
Contributor: Centre de Documentation Eurecom
Submitted on: Wednesday, September 15, 2021 - 11:12:25 AM
Last modification on: Thursday, September 16, 2021 - 3:40:44 AM

### Identifiers

• HAL Id: hal-03344742, version 1
• arXiv: 2106.16200

### Citation

Giulio Franzese, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi. A Unified View of Stochastic Hamiltonian Sampling. 2021. ⟨hal-03344742⟩
