Geometric Losses for Distributional Learning

Abstract: Building upon recent advances in entropy-regularized optimal transport, and upon Fenchel duality between measures and continuous functions, we propose a generalization of the logistic loss that incorporates a metric or cost between classes. Unlike previous attempts to use optimal transport distances for learning, our loss results in unconstrained convex objective functions, supports infinite (or very large) class spaces, and naturally defines a geometric generalization of the softmax operator. The geometric properties of this loss make it suitable for predicting sparse and singular distributions, for instance supported on curves or hyper-surfaces. We study the theoretical properties of our loss and showcase its effectiveness on two applications: ordinal regression and drawing generation.
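The abstract builds on entropy-regularized optimal transport. Below is a minimal NumPy sketch of that ingredient only: Sinkhorn iterations computing a regularized transport plan between two class distributions under a ground cost. The function name sinkhorn_plan, the toy cost matrix C, the regularization strength eps, and the distributions p and q are illustrative assumptions; this is not the paper's geometric softmax or loss, which are derived through Fenchel duality rather than computed this way.

import numpy as np

def sinkhorn_plan(a, b, C, eps=0.1, n_iters=500):
    # Entropy-regularized optimal transport between histograms a and b
    # with ground cost C, computed by Sinkhorn iterations on the Gibbs kernel.
    K = np.exp(-C / eps)               # Gibbs kernel exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)              # rescale columns to match marginal b
        u = a / (K @ v)                # rescale rows to match marginal a
    return u[:, None] * K * v[None, :]

# Toy "geometry" on three ordered classes: confusing class 0 with class 2
# costs more than confusing it with the adjacent class 1.
C = np.array([[0., 1., 2.],
              [1., 0., 1.],
              [2., 1., 0.]])
p = np.array([0.7, 0.2, 0.1])          # predicted class distribution
q = np.array([0.1, 0.2, 0.7])          # target class distribution
P = sinkhorn_plan(p, q, C, eps=0.05)
print(P.sum(axis=1))                   # ~ p: marginal constraint holds
print((P * C).sum())                   # cost-aware discrepancy between p and q

The printed transport cost depends on how far mass must move between classes under C, which is the kind of class geometry the proposed loss is designed to exploit.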

https://hal.archives-ouvertes.fr/hal-02129281
Contributor: Arthur Mensch
Submitted on: Tuesday, May 14, 2019 - 5:56:36 PM
Last modification on: Thursday, May 16, 2019 - 1:46:53 AM

Files

article.pdf (produced by the author(s))

Identifiers

  • HAL Id: hal-02129281, version 1
  • arXiv: 1905.06005

Citation

Arthur Mensch, Mathieu Blondel, Gabriel Peyré. Geometric Losses for Distributional Learning. Proceedings of the International Conference on Machine Learning, 2019, Long Beach, United States. ⟨hal-02129281⟩
