Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints - Archive ouverte HAL
Journal article in Computer Methods in Applied Mechanics and Engineering, Year: 2022

Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints

Abstract

This paper deals with a probabilistic learning inference that allows data (a target set) to be integrated into predictive models, for which the target set consists of statistical moments of the quantity of interest (QoI) and the training set consists of a small number of points. In the first part, we present a mathematical analysis of a general methodology for estimating a posterior probability model of a stochastic boundary value problem from a prior probability model of a control parameter. The given targets are statistical moments of the QoI for which the underlying realizations are not available. Under these conditions, the Kullback-Leibler minimum principle is used for estimating the posterior probability measure. The constraints are represented by an implicit nonlinear mapping, for which a statistical surrogate model is introduced. The MCMC generator and the necessary numerical elements are given to facilitate the implementation of the methodology. In the second part, an application is presented to illustrate the proposed theory; it is also, as such, a contribution to the three-dimensional stochastic homogenization of heterogeneous linear elastic media in the case of a non-separation of the microscale and the macroscale. Consequently, the macroscale is another mesoscale at a larger scale, with random effective/apparent elastic properties. For the construction of the posterior probability measure by the probabilistic learning inference, in addition to the constraints defined by given statistical moments of the random effective/apparent elasticity tensor, the second-order moment of the random normalized residue of the stochastic partial differential equation has been added as a constraint. This constraint guarantees that the algorithm seeks to bring the statistical moments closer to their targets while preserving a small residue.
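As background, a minimal sketch of the Kullback-Leibler minimum principle under moment constraints, written in generic notation (the symbols below are illustrative and do not reproduce the paper's notation): given a prior probability density $p_{\mathrm{prior}}$ of a random vector $H$ and target moments $b$ defined through a (possibly implicit) mapping $g$, the posterior density solves

\[
p_{\mathrm{post}} = \arg\min_{p} \int p(h)\,\log\frac{p(h)}{p_{\mathrm{prior}}(h)}\,dh
\quad \text{subject to} \quad \int g(h)\,p(h)\,dh = b , \qquad \int p(h)\,dh = 1 .
\]

Introducing a Lagrange multiplier $\lambda$ for the moment constraints yields the exponential-tilting form

\[
p_{\mathrm{post}}(h) = c_0(\lambda)\, p_{\mathrm{prior}}(h)\, \exp\!\big( -\langle \lambda , g(h) \rangle \big) ,
\]

where $c_0(\lambda)$ is the normalization constant and $\lambda$ is computed so that the constraints are satisfied; realizations of $p_{\mathrm{post}}$ are then drawn with an MCMC generator. In the paper, the mapping defining the constraints is implicit and is represented by a statistical surrogate model.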
Main file
publi-2022-CMAME-395()115078-soize-preprint.pdf (438.81 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03667277, version 1 (13-05-2022)

Identifiers

Cite

Christian Soize. Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints. Computer Methods in Applied Mechanics and Engineering, 2022, 395, pp.115078. ⟨10.1016/j.cma.2022.115078⟩. ⟨hal-03667277⟩