On the Existence of Optimal Transport Gradient for Learning Generative Models

Antoine Houdard 1, Arthur Leclaire 1, Nicolas Papadakis 1, Julien Rabin 2
2 Equipe Image - Laboratoire GREYC (UMR 6072) - Groupe de Recherche en Informatique, Image et Instrumentation de Caen
Abstract : The use of optimal transport cost for learning generative models has become popular with Wasserstein Generative Adversarial Networks (WGAN). Training a WGAN relies on a theoretical foundation: the computation of the gradient of the optimal transport cost with respect to the generative model parameters. We first demonstrate that such a gradient may not be defined, which can result in numerical instabilities during gradient-based optimization. We address this issue by stating a valid differentiation theorem in the case of entropic regularized transport and by specifying conditions under which existence is ensured. By exploiting the discrete nature of empirical data, we formulate the gradient in a semi-discrete setting and propose an algorithm for optimizing the generative model parameters. Finally, we illustrate numerically the advantages of the proposed framework.
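The entropic regularization mentioned in the abstract smooths the optimal transport cost, which is what makes a well-defined gradient possible. As a rough, self-contained illustration of that regularized cost (a sketch of the standard Sinkhorn iteration between two discrete point clouds, not the authors' semi-discrete algorithm; the function name and parameters are illustrative assumptions):

```python
import numpy as np

def sinkhorn_cost(x, y, a, b, eps=1.0, n_iter=200):
    """Entropic regularized OT cost between two discrete measures.

    x: (n, d) source support with weights a (n,), summing to 1
    y: (m, d) target support with weights b (m,), summing to 1
    eps: entropic regularization strength (larger = smoother)
    """
    # Squared Euclidean ground cost matrix, shape (n, m)
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    # Gibbs kernel
    K = np.exp(-C / eps)
    # Sinkhorn iterations: alternate matching of the two marginals
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    # Optimal entropic coupling and its transport cost
    P = u[:, None] * K * v[None, :]
    return (P * C).sum()
```

For the squared Euclidean cost, translating the target measure by a vector t shifts this cost in a way that depends only on the marginals, which gives a quick sanity check: moving the target cloud strictly increases the cost.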
Document type : Preprints, Working Papers, ...

https://hal.archives-ouvertes.fr/hal-03137342
Contributor : Antoine Houdard
Submitted on : Wednesday, February 10, 2021 - 1:37:43 PM
Last modification on : Saturday, June 25, 2022 - 9:56:35 AM
Long-term archiving on : Tuesday, May 11, 2021 - 6:36:16 PM

Files

main.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-03137342, version 1

Citation

Antoine Houdard, Arthur Leclaire, Nicolas Papadakis, Julien Rabin. On the Existence of Optimal Transport Gradient for Learning Generative Models. 2021. ⟨hal-03137342⟩

Metrics

Record views : 139
Files downloads : 129