Highly-Smooth Zero-th Order Online Optimization

Francis Bach 1, 2 Vianney Perchet 3
1 SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: The minimization of convex functions which are only available through partial and noisy information is a key methodological problem in many disciplines. In this paper we consider convex optimization with noisy zero-th order information, that is, noisy function evaluations at any desired point. We focus on problems with high degrees of smoothness, such as logistic regression. We show that, as opposed to gradient-based algorithms, high-order smoothness may be used to improve estimation rates, with a precise dependence of our upper bounds on the degree of smoothness. In particular, we show that for infinitely differentiable functions, we recover the same dependence on sample size as gradient-based algorithms, with an extra dimension-dependent factor. This is done for both convex and strongly-convex functions, with finite-horizon and anytime algorithms. Finally, we also recover similar results in the online optimization setting.
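To make the setting concrete, the following is a minimal sketch of a standard two-point zeroth-order gradient estimator driving gradient descent. This is a generic illustration of optimization from noiseless function evaluations only, not the paper's specific higher-order-smoothness estimator; the function, step size, and smoothing radius below are illustrative choices.

```python
import numpy as np

def zo_gradient_estimate(f, x, delta=1e-2, rng=None):
    """Two-point zeroth-order gradient estimate.

    Returns d/(2*delta) * (f(x + delta*u) - f(x - delta*u)) * u,
    with u drawn uniformly on the unit sphere; its expectation
    approximates the gradient of f at x (exactly, for quadratics).
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)          # uniform direction on the sphere
    return d / (2 * delta) * (f(x + delta * u) - f(x - delta * u)) * u

def zo_gradient_descent(f, x0, steps=2000, eta=0.05, delta=1e-2, seed=0):
    """Plain gradient descent using only function evaluations of f."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(steps):
        x -= eta * zo_gradient_estimate(f, x, delta, rng)
    return x

# Illustrative example: a strongly convex quadratic with minimizer at 1.
f = lambda x: 0.5 * np.sum((x - 1.0) ** 2)
x_star = zo_gradient_descent(f, np.zeros(3))
```

With only two function evaluations per step, the iterates contract toward the minimizer; higher-order smoothness, as exploited in the paper, serves to reduce the bias of such estimators.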
Document type: Conference paper
Conference on Learning Theory (COLT), Jun 2016, New York, United States. 2016

Cited literature: 25 references

Contributor: Francis Bach
Submitted on: Wednesday, May 25, 2016 - 22:10:44
Last modified on: Thursday, February 7, 2019 - 15:49:47
Long-term archiving on: Friday, August 26, 2016 - 11:06:28


Files produced by the author(s)


  • HAL Id : hal-01321532, version 1
  • arXiv: 1605.08165


Francis Bach, Vianney Perchet. Highly-Smooth Zero-th Order Online Optimization. Conference on Learning Theory (COLT), Jun 2016, New York, United States. 2016. 〈hal-01321532〉


