SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives - HAL open archive
Report. Year: 2014


Abstract

In this work we introduce a new optimisation method called SAGA in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and has support for composite objectives where a proximal operator is used on the regulariser. Unlike SDCA, SAGA supports non-strongly convex problems directly, and is adaptive to any inherent strong convexity of the problem. We give experimental results showing the effectiveness of our method.
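To make the idea concrete, here is a minimal sketch of the SAGA update in NumPy: the algorithm keeps a table of the last gradient seen for each component function, and at each step forms an unbiased variance-reduced gradient estimate from a freshly sampled gradient, the stored one, and the table average, optionally followed by a proximal step for a composite regulariser. The function and parameter names (`saga`, `grad_i`, `prox`, `step`) are illustrative choices, not from the paper.

```python
import numpy as np

def saga(grad_i, n, x0, step, prox=None, iters=1000, seed=0):
    """Sketch of the SAGA method.

    grad_i(i, x): gradient of the i-th component function f_i at x.
    prox(x, step): proximal operator of the regulariser (optional).
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    # Table of the most recently evaluated per-example gradients,
    # plus their running average.
    table = np.array([grad_i(i, x) for i in range(n)])
    avg = table.mean(axis=0)
    for _ in range(iters):
        j = rng.integers(n)
        g_new = grad_i(j, x)
        # Unbiased variance-reduced estimate of the full gradient.
        v = g_new - table[j] + avg
        x = x - step * v
        if prox is not None:
            x = prox(x, step)
        # Update the running average, then overwrite the stored gradient.
        avg += (g_new - table[j]) / n
        table[j] = g_new
    return x

# Example usage on a tiny least-squares problem (f_i(x) = 0.5 (a_i·x - b_i)^2):
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
grad = lambda i, x: (A[i] @ x - b[i]) * A[i]
x_hat = saga(grad, n=3, x0=np.zeros(2), step=0.05, iters=5000)
```

Because each step touches only one stored gradient, the per-iteration cost matches plain stochastic gradient descent while the stored table removes the variance that would otherwise force a decaying step size.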
Main file: hal_saga.pdf (159.52 KB). Origin: files produced by the author(s).

Dates and versions

hal-01016843 , version 1 (01-07-2014)
hal-01016843 , version 2 (17-07-2014)
hal-01016843 , version 3 (12-11-2014)

Identifiers

hal-01016843

Cite

Aaron Defazio, Francis Bach, Simon Lacoste-Julien. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives. 2014. ⟨hal-01016843v1⟩
909 views
859 downloads
