SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent - Archive ouverte HAL
Journal article, Journal of Machine Learning Research, 2009

SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent

Abstract

The SGD-QN algorithm is a stochastic gradient descent algorithm that makes careful use of second-order information and splits the parameter update into independently scheduled components. Thanks to this design, SGD-QN iterates nearly as fast as a first-order stochastic gradient descent but requires fewer iterations to achieve the same accuracy. This algorithm won the "Wild Track" of the first PASCAL Large Scale Learning Challenge (Sonnenburg et al., 2008).
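The abstract's central idea, a cheap diagonal second-order rescaling whose re-estimation runs on its own, slower schedule than the weight update, can be sketched as follows. This is a toy illustration for L2-regularized logistic regression with made-up hyperparameter defaults (`lam`, `t0`, `skip`), not the authors' reference implementation of SGD-QN.

```python
import numpy as np

def sgd_qn_sketch(X, y, lam=1e-2, t0=1e2, skip=16, epochs=5, seed=0):
    """Sketch of an SGD-QN-style update: plain SGD steps rescaled by a
    diagonal matrix B that approximates inverse curvature.  B itself is
    re-estimated from a secant equation only every `skip` examples, so
    the weight update and the curvature update run on independent
    schedules.  Hyperparameters are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    # With L2 regularization the curvature is at least lam, so B <= 1/lam.
    B = np.full(d, 1.0 / lam)
    t, count = 0, skip

    def grad(w, xi, yi):
        # Gradient of the regularized logistic loss on a single example.
        z = np.clip(yi * xi.dot(w), -30.0, 30.0)
        return lam * w - (yi / (1.0 + np.exp(z))) * xi

    for _ in range(epochs):
        for i in rng.permutation(n):
            xi, yi = X[i], float(y[i])
            g = grad(w, xi, yi)
            eta = 1.0 / (t + t0)
            w_new = w - eta * B * g          # diagonally rescaled SGD step
            count -= 1
            if count <= 0:
                # Secant estimate on the SAME example: compare gradients
                # before and after the step, componentwise.
                dg = grad(w_new, xi, yi) - g
                dw = w_new - w
                safe = np.abs(dg) > 1e-12
                ratio = np.where(safe, dw / np.where(safe, dg, 1.0), B)
                ratio = np.clip(ratio, 1e-2, 1.0 / lam)  # keep B positive, bounded
                B += (2.0 / skip) * (ratio - B)          # smoothed update
                count = skip
            w = w_new
            t += 1
    return w
```

Because the secant-based update of `B` fires only once every `skip` examples, the amortized per-example cost stays close to first-order SGD, which is the trade-off the abstract describes.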
Main file: bordes09jmlr.pdf (146.25 KB)
Origin: publisher files allowed on an open archive

Dates and versions

hal-00750911 , version 1 (12-11-2012)

Identifiers

  • HAL Id : hal-00750911 , version 1

Cite

Antoine Bordes, Léon Bottou, Patrick Gallinari. SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent. Journal of Machine Learning Research, 2009, 10, pp.1737-1754. ⟨hal-00750911⟩
236 views
895 downloads
