Conference papers

Geometric Losses for Distributional Learning

Abstract: Building upon recent advances in entropy-regularized optimal transport, and upon Fenchel duality between measures and continuous functions, we propose a generalization of the logistic loss that incorporates a metric or cost between classes. Unlike previous attempts to use optimal transport distances for learning, our loss results in unconstrained convex objective functions, supports infinite (or very large) class spaces, and naturally defines a geometric generalization of the softmax operator. The geometric properties of this loss make it suitable for predicting sparse and singular distributions, for instance supported on curves or hyper-surfaces. We study the theoretical properties of our loss and showcase its effectiveness on two applications: ordinal regression and drawing generation.
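To make the abstract's idea concrete, the following is a minimal toy sketch, not the paper's exact construction. The standard softmax treats all classes as interchangeable; the sketch below contrasts it with a hypothetical cost-aware variant in which scores are smeared through a kernel derived from a class cost matrix, so probability mass leaks toward metrically nearby classes. The function names and the smoothing scheme are illustrative assumptions; the paper's geometric softmax is defined via the Fenchel conjugate of an entropy-regularized optimal-transport cost.

```python
import math

def softmax(scores):
    # Standard softmax: the gradient of log-sum-exp. It is "metric-blind":
    # it never compares one class to another.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def cost_smoothed_softmax(scores, cost, eps=1.0):
    # Hypothetical illustration of a cost-aware softmax: each class score is
    # replaced by a log-sum-exp of all scores discounted by the cost to reach
    # them (a soft minimum over cost[i][j]/eps - scores[j]), then normalized.
    # This is a toy sketch of the idea, NOT the paper's g-softmax.
    n = len(scores)
    smeared = [
        math.log(sum(math.exp(scores[j] - cost[i][j] / eps) for j in range(n)))
        for i in range(n)
    ]
    return softmax(smeared)
```

With classes laid out on a line (cost `|i - j|`), a high score on class 0 leaks more mass onto the adjacent class 1 than onto the distant class 2, whereas the standard softmax assigns the two equal probability.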

Contributor: Arthur Mensch
Submitted on: Tuesday, May 14, 2019 - 5:56:36 PM
Last modification on: Thursday, March 17, 2022 - 10:08:19 AM




  • HAL Id: hal-02129281, version 1
  • arXiv: 1905.06005



Arthur Mensch, Mathieu Blondel, Gabriel Peyré. Geometric Losses for Distributional Learning. Proceedings of the International Conference on Machine Learning, 2019, Long Beach, United States. ⟨hal-02129281⟩


