DADAO: Decoupled Accelerated Decentralized Asynchronous Optimization
Conference paper · Year: 2023


Abstract

This work introduces DADAO: the first decentralized, accelerated, asynchronous, primal, first-order algorithm to minimize a sum of $L$-smooth and $\mu$-strongly convex functions distributed over a given network of size $n$. Our key insight is to model the local gradient updates and the gossip communication procedure with two separate, independent Poisson Point Processes. This decouples the computation and communication steps, which can then be run in parallel, while making the whole approach completely asynchronous, and it yields a communication acceleration compared to synchronous approaches. Our method employs primal gradients and uses neither a multi-consensus inner loop nor other ad hoc mechanisms such as Error Feedback, Gradient Tracking, or a Proximal operator. By relating the inverse of the smallest positive eigenvalue of the Laplacian matrix $\chi_1$ and the maximal resistance $\chi_2\leq \chi_1$ of the graph to a sufficient minimal communication rate between the nodes of the network, we show that our algorithm requires $\mathcal{O}(n\sqrt{\frac{L}{\mu}}\log(\frac{1}{\epsilon}))$ local gradients and only $\mathcal{O}(n\sqrt{\chi_1\chi_2}\sqrt{\frac{L}{\mu}}\log(\frac{1}{\epsilon}))$ communications to reach a precision $\epsilon$, up to logarithmic terms. We thus simultaneously obtain an accelerated rate for both computations and communications, improving over state-of-the-art works; our simulations further validate the strength of our relatively unconstrained method.
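The decoupling idea above — driving local gradient updates and gossip communications with two independent Poisson point processes, so neither step waits on the other — can be sketched as follows. This is a minimal illustration only: the rates below are hypothetical placeholders, whereas the paper derives a sufficient communication rate from the graph quantities $\chi_1$ and $\chi_2$.

```python
import random

def poisson_events(rate, horizon, rng):
    """Sample the event times of a homogeneous Poisson point process on (0, horizon]
    by accumulating i.i.d. exponential inter-arrival times."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

# Hypothetical, illustrative rates; not the rates prescribed by the paper.
rng = random.Random(0)
grad_times = poisson_events(2.0, 10.0, rng)    # local gradient (computation) clock
gossip_times = poisson_events(5.0, 10.0, rng)  # pairwise gossip (communication) clock

# Merging the two independent clocks gives a fully asynchronous schedule:
# computation and communication events interleave with no global barrier.
schedule = sorted([(t, "gradient") for t in grad_times] +
                  [(t, "gossip") for t in gossip_times])
```

Because the two processes are independent, the expected numbers of gradient and gossip events over a time window can be tuned separately, which is what allows the computation and communication complexities to be accelerated simultaneously.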
Main file: DADAO_ICML.pdf (435.64 KB)
Origin: files produced by the author(s)

Dates and versions

hal-03737694 , version 1 (25-07-2022)
hal-03737694 , version 2 (14-02-2023)
hal-03737694 , version 3 (15-11-2023)

Identifiers

hal-03737694

Cite

Adel Nabli, Edouard Oyallon. DADAO: Decoupled Accelerated Decentralized Asynchronous Optimization. International Conference on Machine Learning, Jul 2023, Honolulu, United States. ⟨hal-03737694v3⟩
532 views
245 downloads

