Journal article in Mathematics of Computation, 2022

Certified dimension reduction in nonlinear Bayesian inverse problems

Abstract

We propose a dimension reduction technique for Bayesian inverse problems with nonlinear forward operators, non-Gaussian priors, and non-Gaussian observation noise. The likelihood function is approximated by a ridge function, i.e., a map which depends non-trivially only on a few linear combinations of the parameters. We build this ridge approximation by minimizing an upper bound on the Kullback-Leibler divergence between the posterior distribution and its approximation. This bound, obtained via logarithmic Sobolev inequalities, allows one to certify the error of the posterior approximation. Evaluating the bound requires the second moment matrix of the gradient of the log-likelihood function, which in practice must be approximated from samples. We provide an analysis that enables control of the posterior approximation error due to this sampling. Numerical and theoretical comparisons with existing methods illustrate the benefits of the proposed methodology.
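The abstract outlines the computational core of the method: estimate the second moment matrix of the log-likelihood gradient, then truncate its spectrum so that the certified error bound stays below a tolerance. As a rough illustration only (not the authors' code), the following Python sketch assumes a standard Gaussian prior, so that the logarithmic Sobolev constant is one and the bound reduces to half the sum of the discarded eigenvalues; the helper name certified_subspace and the linear-Gaussian test case are hypothetical.

```python
import numpy as np

def certified_subspace(grad_log_likelihood, prior_samples, tol):
    """Sketch: Monte Carlo estimate of H = E[ grad log L(x) grad log L(x)^T ]
    and rank selection from the certified error bound (standard Gaussian prior)."""
    grads = np.array([grad_log_likelihood(x) for x in prior_samples])   # (N, d)
    H = grads.T @ grads / len(prior_samples)                            # (d, d)

    eigvals, eigvecs = np.linalg.eigh(H)                 # ascending eigenvalues
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # reorder to descending

    # tails[r] = sum of the eigenvalues discarded when keeping the first r directions;
    # half of it bounds the KL divergence of the ridge approximation (see the paper
    # for the general statement with an arbitrary log-Sobolev constant).
    tails = np.concatenate([np.cumsum(eigvals[::-1])[::-1], [0.0]])
    r = int(np.argmax(0.5 * tails <= tol))   # smallest rank meeting the tolerance
    return eigvecs[:, :r], 0.5 * tails[r], eigvals

# Hypothetical linear-Gaussian test case, for illustration only.
rng = np.random.default_rng(0)
d, n_samples, sigma = 20, 2000, 0.1
G = rng.standard_normal((3, d))          # forward operator
y = rng.standard_normal(3)               # observed data
grad_ll = lambda x: G.T @ (y - G @ x) / sigma**2
Ur, bound, spectrum = certified_subspace(grad_ll, rng.standard_normal((n_samples, d)), tol=1e-2)
print(Ur.shape, bound)                   # informed directions and certified KL bound
```

Here the expectation defining the matrix is taken over prior samples purely for simplicity; as the abstract notes, the sample-based approximation introduces its own error, which the paper's analysis controls.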
Main file: preprint.pdf (1.46 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01834039, version 1 (10-07-2018)
hal-01834039, version 2 (09-03-2021)


Cite

Olivier Zahm, Tiangang Cui, Kody Law, Alessio Spantini, Youssef Marzouk. Certified dimension reduction in nonlinear Bayesian inverse problems. Mathematics of Computation, 2022, 91, pp.1789-1835. ⟨10.1090/mcom/3737⟩. ⟨hal-01834039v2⟩
