Stochastic Optimization for Large-scale Optimal Transport

Abstract: Optimal transport (OT) defines a powerful framework to compare probability distributions in a geometrically faithful way. However, the practical impact of OT is still limited because of its computational burden. We propose a new class of stochastic optimization algorithms to cope with large-scale problems routinely encountered in machine learning applications. These methods can handle arbitrary distributions (either discrete or continuous) as long as one can draw samples from them, which is the typical setup in high-dimensional learning problems. This alleviates the need to discretize these densities, while giving access to provably convergent methods that output the correct distance without discretization error. These algorithms rely on two main ideas: (a) the dual OT problem can be re-cast as the maximization of an expectation; (b) entropic regularization of the primal OT problem results in a smooth dual optimization problem which can be addressed with algorithms that have a provably faster convergence. We instantiate these ideas in three different setups: (i) when comparing a discrete distribution to another, we show that incremental stochastic optimization schemes can beat Sinkhorn's algorithm, the current state-of-the-art finite-dimensional OT solver; (ii) when comparing a discrete distribution to a continuous density, a semi-discrete reformulation of the dual program is amenable to averaged stochastic gradient descent, leading to better performance than approximately solving the problem by discretization; (iii) when dealing with two continuous densities, we propose a stochastic gradient descent over a reproducing kernel Hilbert space (RKHS). This is currently the only known method to solve this problem, apart from computing OT on finite samples. We back up these claims on a set of discrete, semi-discrete and continuous benchmark problems.
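To make setup (ii) concrete, the sketch below runs averaged stochastic gradient ascent on the entropic semi-dual objective for a semi-discrete problem (continuous source, discrete target). This is a minimal illustration in NumPy, not the authors' code: the squared-Euclidean cost, the 1/sqrt(k) step size, and all function names are assumptions made for the example.

```python
import numpy as np

def semidual_grad(x, v, y, nu, eps):
    """Gradient w.r.t. v of the entropic semi-dual term
    h_eps(x, v) = sum_j v_j nu_j - eps * log(sum_j nu_j exp((v_j - c(x,y_j))/eps)) - eps,
    for a single continuous sample x and discrete target (y, nu)."""
    cost = (x - y) ** 2                 # squared-Euclidean cost c(x, y_j) (assumed)
    z = (v - cost) / eps
    z -= z.max()                        # stabilize the log-sum-exp / softmax
    p = nu * np.exp(z)
    p /= p.sum()                        # conditional transport plan pi(. | x)
    return nu - p                       # ascent direction on the dual potential v

def averaged_sgd_semidiscrete(sampler, y, nu, eps=0.1, n_iter=5000, c0=1.0, seed=0):
    """Averaged SGD on the semi-dual: one sample of the continuous
    distribution per iteration, Polyak-Ruppert averaging of the iterates."""
    rng = np.random.default_rng(seed)
    v = np.zeros_like(nu)
    v_avg = np.zeros_like(nu)
    for k in range(1, n_iter + 1):
        x = sampler(rng)                          # draw x ~ mu
        v += (c0 / np.sqrt(k)) * semidual_grad(x, v, y, nu, eps)
        v_avg += (v - v_avg) / k                  # running average of iterates
    return v_avg

# Example: mu = standard Gaussian, nu = uniform on three atoms.
y = np.array([-1.0, 0.0, 1.0])
nu = np.ones(3) / 3
v_hat = averaged_sgd_semidiscrete(lambda rng: rng.normal(), y, nu, eps=0.5, n_iter=2000)
```

Only samples from the continuous measure are needed, so no discretization of its density occurs; the returned averaged potential `v_hat` can then be plugged back into the semi-dual expectation to estimate the regularized OT cost.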
Document type:
Conference paper
NIPS 2016 - Thirtieth Annual Conference on Neural Information Processing Systems, Dec 2016, Barcelona, Spain. Proc. NIPS 2016. 〈https://nips.cc/〉

Cited literature: 24 references

https://hal.archives-ouvertes.fr/hal-01321664
Contributor: Gabriel Peyré
Submitted on: Wednesday, November 2, 2016 - 21:45:26
Last modified on: Wednesday, September 12, 2018 - 01:28:22
Long-term archiving on: Friday, February 3, 2017 - 16:28:32

Files

StochasticOT-NIPS.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01321664, version 2
  • arXiv: 1605.08527

Citation

Aude Genevay, Marco Cuturi, Gabriel Peyré, Francis Bach. Stochastic Optimization for Large-scale Optimal Transport. NIPS 2016 - Thirtieth Annual Conference on Neural Information Processing Systems, Dec 2016, Barcelona, Spain. Proc. NIPS 2016. 〈https://nips.cc/〉. 〈hal-01321664v2〉

