
KALE Flow: A Relaxed KL Gradient Flow for Probabilities with Disjoint Support

Abstract: We study the gradient flow for a relaxed approximation to the Kullback-Leibler (KL) divergence between a moving source and a fixed target distribution. This approximation, termed the KALE (KL Approximate Lower bound Estimator), solves a regularized version of the Fenchel dual problem defining the KL over a restricted class of functions. When using a Reproducing Kernel Hilbert Space (RKHS) to define the function class, we show that the KALE continuously interpolates between the KL and the Maximum Mean Discrepancy (MMD). Like the MMD and other Integral Probability Metrics, the KALE remains well-defined for mutually singular distributions. Nonetheless, the KALE inherits from the limiting KL a greater sensitivity to mismatch in the support of the distributions, compared with the MMD. These two properties make the KALE gradient flow particularly well suited when the target distribution is supported on a low-dimensional manifold. Under an assumption of sufficient smoothness of the trajectories, we show the global convergence of the KALE flow. We propose a particle implementation of the flow given initial samples from the source and the target distribution, which we use to empirically confirm the KALE's properties.
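The particle implementation described in the abstract can be sketched in NumPy as follows. This is an illustrative reconstruction, not the authors' code: the Gaussian kernel and its bandwidth, the simplified dual objective mean_P[h] - mean_Q[e^h - 1] - (lam/2) ||h||^2, the inner gradient-ascent scheme, and all function names are assumptions made for the demo; the paper's exact KALE scaling and optimization may differ.

```python
import numpy as np

SIGMA = 2.0  # kernel bandwidth; a hypothetical choice for this demo


def gaussian_kernel(A, B):
    """K[i, j] = exp(-||A_i - B_j||^2 / (2 * SIGMA^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * SIGMA ** 2))


def fit_witness(X, Y, lam=0.1, lr=0.005, steps=300):
    """Gradient ascent on a regularized KL dual restricted to an RKHS span.

    The witness is parameterized as h(.) = sum_i alpha_i k(., Z_i) over the
    pooled samples Z = [X; Y], and we ascend the (simplified) objective
        mean_P[h] - mean_Q[e^h - 1] - (lam / 2) * alpha^T K alpha.
    """
    Z = np.concatenate([X, Y], axis=0)
    K = gaussian_kernel(Z, Z)
    n, m = len(X), len(Y)
    alpha = np.zeros(len(Z))
    for _ in range(steps):
        h = K @ alpha
        grad = (K[:n].sum(axis=0) / n            # d/dalpha of mean_P[h]
                - K[n:].T @ (np.exp(h[n:]) / m)  # d/dalpha of mean_Q[e^h - 1]
                - lam * (K @ alpha))             # d/dalpha of the RKHS penalty
        alpha += lr * grad
    return Z, alpha


def witness_gradient(X, Z, alpha):
    """grad_x h(x) = sum_i alpha_i k(x, Z_i) (Z_i - x) / SIGMA^2."""
    K = gaussian_kernel(X, Z)             # (N, M)
    diff = Z[None, :, :] - X[:, None, :]  # (N, M, d)
    return ((K * alpha[None, :])[:, :, None] * diff).sum(axis=1) / SIGMA ** 2


def kale_flow(X, Y, outer_steps=20, step_size=0.5):
    """Discretized flow: refit the witness, then move each source particle
    a small step along the negative witness gradient."""
    X = X.copy()
    for _ in range(outer_steps):
        Z, alpha = fit_witness(X, Y)
        X -= step_size * witness_gradient(X, Z, alpha)
    return X
```

For example, with source samples drawn near 0 and target samples near 2, repeated witness fits and particle updates should transport the source cloud toward the target, with the witness (and hence the updates) vanishing as the two sample sets come to overlap.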
Document type: Conference papers
Contributor: Michael Arbel
Submitted on: Monday, November 29, 2021 - 4:34:01 PM
Last modification on: Sunday, June 26, 2022 - 3:22:03 AM
HAL Id: hal-03455473, version 1



Pierre Glaser, Michael Arbel, Arthur Gretton. KALE Flow: A Relaxed KL Gradient Flow for Probabilities with Disjoint Support. NeurIPS 2021 - Thirty-Fifth Annual Conference on Neural Information Processing Systems, Dec 2021, Online, France. pp. 1-29. ⟨hal-03455473⟩


