Conference papers

A Distributed Frank-Wolfe Algorithm for Communication-Efficient Sparse Learning

Abstract: Learning sparse combinations is a frequent theme in machine learning. In this paper, we study its associated optimization problem in the distributed setting, where the elements to be combined are not centrally located but spread over a network. We address the key challenge of balancing communication costs and optimization errors. To this end, we propose a distributed Frank-Wolfe (dFW) algorithm. We obtain theoretical guarantees on the optimization error and communication cost that do not depend on the total number of combining elements. We further show that the communication cost of dFW is optimal by deriving a lower bound on the communication cost required to construct an ε-approximate solution. We validate our theoretical analysis with empirical studies on synthetic and real-world data, which demonstrate that dFW outperforms both baselines and competing methods. We also study the performance of dFW when the conditions of our analysis are relaxed, and show that dFW is fairly robust.
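For context (this sketch is not taken from the paper): a key property behind communication-efficient sparse learning with Frank-Wolfe is that, over an ℓ1 ball, each Frank-Wolfe step selects a single coordinate, so iterates stay sparse and each update is cheap to describe. A minimal centralized sketch, with all function and variable names hypothetical:

```python
import numpy as np

def frank_wolfe_l1(grad, x0, radius=1.0, iters=100):
    """Classic Frank-Wolfe over an l1 ball of the given radius.

    The linear minimizer over the l1 ball is a signed, scaled basis
    vector, so each iteration touches one coordinate and the iterate
    after t steps has at most t nonzeros.
    """
    x = x0.copy()
    for t in range(iters):
        g = grad(x)
        i = int(np.argmax(np.abs(g)))      # coordinate with largest |gradient|
        s = np.zeros_like(x)
        s[i] = -radius * np.sign(g[i])     # vertex of the l1 ball
        gamma = 2.0 / (t + 2.0)            # standard diminishing step size
        x = (1.0 - gamma) * x + gamma * s  # convex-combination update
    return x

# Toy usage: sparse least squares, min ||Ax - b||^2 s.t. ||x||_1 <= 1
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[0], x_true[1] = 0.7, -0.3
b = A @ x_true
x = frank_wolfe_l1(lambda v: 2 * A.T @ (A @ v - b), np.zeros(20), iters=200)
```

In the distributed setting studied in the paper, the point is that each update can be communicated compactly (an index, a sign, and a step size) rather than as a dense vector, which is what keeps the communication cost independent of the total number of combining elements.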

Cited literature: 36 references

https://hal.inria.fr/hal-01430851
Contributor: Aurélien Bellet
Submitted on: Tuesday, January 10, 2017 - 12:43:30 PM
Last modification on: Thursday, March 5, 2020 - 3:55:53 PM
Document(s) archived on: Tuesday, April 11, 2017 - 2:37:01 PM

File: sdm15.pdf (produced by the author(s))

Citation

Aurélien Bellet, Yingyu Liang, Alireza Bagheri Garakani, Maria-Florina Balcan, Fei Sha. A Distributed Frank-Wolfe Algorithm for Communication-Efficient Sparse Learning. SIAM International Conference on Data Mining (SDM 2015), Apr 2015, Vancouver, Canada. ⟨10.1137/1.9781611974010.54⟩. ⟨hal-01430851⟩
