SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent

Abstract: The SGD-QN algorithm is a stochastic gradient descent algorithm that makes careful use of second-order information and splits the parameter update into independently scheduled components. Thanks to this design, SGD-QN iterates nearly as fast as a first-order stochastic gradient descent but requires fewer iterations to achieve the same accuracy. This algorithm won the "Wild Track" of the first PASCAL Large Scale Learning Challenge (Sonnenburg et al., 2008).
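The idea summarized in the abstract — a diagonal second-order rescaling whose refresh is scheduled independently of the main update, so the average per-iteration cost stays close to plain SGD — can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: the squared hinge loss, the `skip` period, the secant-based curvature estimate, and the clipping bounds are all assumptions made here for concreteness.

```python
import numpy as np

def loss_grad(w, x, yi, lam):
    # Gradient of the regularized squared hinge loss on one example.
    # (Assumption: the loss choice here is ours, for illustration.)
    m = yi * x.dot(w)
    g = lam * w
    if m < 1:
        g = g - 2.0 * (1.0 - m) * yi * x
    return g

def sgdqn_sketch(X, y, lam=1e-3, epochs=10, skip=16, seed=0):
    """Hypothetical sketch of SGD with a diagonal second-order scaling
    that is refreshed only every `skip` iterations (separate schedule)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    B = np.full(d, 1.0 / lam)   # diagonal scaling, initialized near 1/lambda
    t, count, update_B = 0, skip, False
    w_prev = None
    for _ in range(epochs):
        for i in rng.permutation(n):
            x, yi = X[i], y[i]
            g = loss_grad(w, x, yi, lam)
            if update_B and w_prev is not None:
                # Scheduled step: secant estimate of the diagonal curvature
                # from gradients of the SAME example at two nearby points.
                dw = w - w_prev
                dg = g - loss_grad(w_prev, x, yi, lam)
                mask = np.abs(dw) > 1e-12
                r = np.where(mask, dg / np.where(mask, dw, 1.0), lam)
                r = np.clip(r, lam, 100.0 * lam)  # keep curvature bounded
                B += (2.0 / skip) * (1.0 / r - B)
                update_B = False
            t += 1
            w_prev = w.copy()
            w = w - (1.0 / (t + 1.0 / lam)) * B * g
            count -= 1
            if count <= 0:
                count, update_B = skip, True
    return w
```

On most iterations the update costs the same as first-order SGD (one gradient, one scaled step); the extra gradient evaluation for the curvature estimate is paid only once every `skip` iterations, which is the scheduling split the abstract refers to.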
Document type: Journal article

Cited literature: 16 references

https://hal.archives-ouvertes.fr/hal-00750911
Contributor: Antoine Bordes
Submitted on: Monday, November 12, 2012 - 4:32:43 PM
Last modification on: Friday, August 9, 2019 - 2:05:06 PM
Long-term archiving on: Wednesday, February 13, 2013 - 3:46:33 AM

File

bordes09jmlr.pdf
Publisher files allowed on an open archive

Identifiers

  • HAL Id: hal-00750911, version 1

Citation

Antoine Bordes, Léon Bottou, Patrick Gallinari. SGD-QN: Careful Quasi-Newton Stochastic Gradient Descent. Journal of Machine Learning Research, Microtome Publishing, 2009, 10, pp.1737-1754. ⟨hal-00750911⟩

Metrics

Record views: 395
File downloads: 1136