Optimal Algorithms for Non-Smooth Distributed Optimization in Networks

Kevin Scaman 1 Francis Bach 2 Sébastien Bubeck 3 Yin Tat Lee 3 Laurent Massoulié 4
2 SIERRA - Statistical Machine Learning and Parsimony, DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
4 DYOGENE - Dynamics of Geometric Networks, DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique (UMR 8548), Inria de Paris
Abstract: In this work, we consider the distributed optimization of non-smooth convex functions using a network of computing units. We investigate this problem under two regularity assumptions: (1) the Lipschitz continuity of the global objective function, and (2) the Lipschitz continuity of local individual functions. Under the local regularity assumption, we provide the first optimal first-order decentralized algorithm called multi-step primal-dual (MSPD) and its corresponding optimal convergence rate. A notable aspect of this result is that, for non-smooth functions, while the dominant term of the error is in $O(1/\sqrt{t})$, the structure of the communication network only impacts a second-order term in $O(1/t)$, where $t$ is time. In other words, the error due to limits in communication resources decreases at a fast rate even in the case of non-strongly-convex objective functions. Under the global regularity assumption, we provide a simple yet efficient algorithm called distributed randomized smoothing (DRS) based on a local smoothing of the objective function, and show that DRS is within a $d^{1/4}$ multiplicative factor of the optimal convergence rate, where $d$ is the underlying dimension.
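
For intuition on the smoothing step underlying DRS, below is a minimal sketch of Gaussian randomized smoothing of a non-smooth convex function. This is an illustrative sketch of the general smoothing technique, not the authors' DRS implementation; the function name smoothed_value_and_grad and the parameters gamma and num_samples are hypothetical.

import numpy as np

def smoothed_value_and_grad(f, x, gamma=0.1, num_samples=100, rng=None):
    # Monte-Carlo estimate of the Gaussian-smoothed objective
    #   f_gamma(x) = E[ f(x + gamma * Z) ],  Z ~ N(0, I),
    # and of its gradient. The smoothed function is differentiable even
    # when f is merely Lipschitz, which is the idea that smoothing-based
    # methods build on.
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    values = np.empty(num_samples)
    grad = np.zeros(d)
    for i in range(num_samples):
        z = rng.standard_normal(d)
        fx = f(x + gamma * z)
        values[i] = fx
        # Gradient identity for Gaussian smoothing:
        #   grad f_gamma(x) = (1 / gamma) * E[ f(x + gamma * Z) * Z ]
        grad += (fx / gamma) * z
    return values.mean(), grad / num_samples

# Example: smoothing the non-smooth function f(x) = ||x||_1.
f = lambda x: np.abs(x).sum()
value, gradient = smoothed_value_and_grad(f, np.ones(5), gamma=0.05, num_samples=1000)

In a distributed setting, each worker can form such a smoothed surrogate of its local function and then run an accelerated method designed for smooth objectives on it; the parameter gamma trades off the smoothness of the surrogate against its approximation error.
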
Document type: Conference papers

https://hal.archives-ouvertes.fr/hal-01957013
Contributor: Francis Bach
Submitted on: Monday, December 17, 2018 - 8:11:09 AM
Last modification on: Tuesday, February 12, 2019 - 12:48:03 PM


Identifiers

  • HAL Id: hal-01957013, version 1
  • arXiv: 1806.00291

Citation

Kevin Scaman, Francis Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié. Optimal Algorithms for Non-Smooth Distributed Optimization in Networks. Advances in Neural Information Processing Systems, Dec 2018, Montreal, Canada. ⟨hal-01957013⟩
