Data-Targeted Prior Distribution for Variational AutoEncoder - Archive ouverte HAL
Journal article in Fluids, 2021

Data-Targeted Prior Distribution for Variational AutoEncoder

Nissrine Akkari
Fabien Casenave
Thomas Daniel
David Ryckelynck

Abstract

In this paper, we study Bayesian methods using deep neural networks. We are interested in variational autoencoders, in which an encoder approximates the true posterior and a decoder approximates the direct probability. Specifically, we apply these autoencoders to unsteady, compressible fluid flows in aircraft engines. We use inferential methods to compute a sharp approximation of the posterior probability of the latent variables given the transient dynamics of the training velocity fields, and to generate plausible velocity fields. An important application is the initialization of transient numerical simulations of unsteady fluid flows and large eddy simulations in fluid dynamics. By Bayes' theorem, the choice of the prior distribution is very important for the computation of the posterior probability, which is proportional to the product of the likelihood and the prior probability. Hence, we propose a new inference model based on a new prior, defined by a density estimate over the realizations of the kernel proper orthogonal decomposition (kernel POD) coefficients of the available training data. We show numerically that this inference model improves on the results obtained with the usual standard normal prior distribution. The inference model is constructed with a new algorithm that improves the convergence of the parametric optimization of the encoder probability distribution approximating the posterior; like the prior, this distribution is data-targeted. This new generative approach can also be seen as an improvement of the kernel POD method, for which no robust technique is usually available for expressing the pre-image, in the input physical space, of the stochastic reduced field defined in the high-dimensional feature space induced by the kernel inner product.
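The construction of the data-targeted prior described above can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's implementation: the RBF kernel, its bandwidth, the number of modes, and the choice of a Gaussian kernel density estimate are all assumptions made here for the sake of a runnable example.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Stand-in for the training snapshots (e.g. sampled velocity fields):
# n_snapshots realizations of a high-dimensional field, here synthetic.
n_snapshots, n_dof = 50, 200
snapshots = rng.standard_normal((n_snapshots, n_dof))

def rbf_gram(X, gamma=1e-3):
    """Gram matrix of the RBF kernel inner product between snapshots."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

K = rbf_gram(snapshots)

# Center the Gram matrix in feature space.
n = K.shape[0]
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H

# Kernel POD: eigendecomposition of the centered Gram matrix,
# modes sorted by decreasing eigenvalue.
eigvals, eigvecs = np.linalg.eigh(Kc)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Kernel POD coefficients of the training data on the leading modes.
n_modes = 2
coeffs = eigvecs[:, :n_modes] * np.sqrt(np.maximum(eigvals[:n_modes], 0.0))

# Data-targeted prior: a kernel density estimate fitted to these
# coefficients (gaussian_kde expects shape (n_dims, n_samples)).
prior = gaussian_kde(coeffs.T)

# The prior can now be evaluated (e.g. inside the ELBO's KL term)
# and sampled to generate plausible latent codes.
log_density = prior.logpdf(coeffs.T)   # log p(z) at the training coefficients
samples = prior.resample(10, seed=1)   # shape (n_modes, 10)
```

In a VAE training loop, this density estimate would replace the standard normal in the prior term of the objective, concentrating the latent prior on the region actually occupied by the training data's kernel POD coefficients.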

Dates and versions

hal-03359211 , version 1 (30-09-2021)


Cite

Nissrine Akkari, Fabien Casenave, Thomas Daniel, David Ryckelynck. Data-Targeted Prior Distribution for Variational AutoEncoder. Fluids, 2021, 6 (10), pp.343. ⟨10.3390/fluids6100343⟩. ⟨hal-03359211⟩