Stochastic mirror descent for variationally coherent optimization problems

Abstract: In this paper, we examine a class of non-convex stochastic optimization problems which we call variationally coherent, and which properly includes pseudo-/quasi-convex and star-convex optimization problems. To solve such problems, we focus on the widely used stochastic mirror descent (SMD) family of algorithms (which contains stochastic gradient descent as a special case), and we show that the last iterate of SMD converges to the problem's solution set with probability 1. This result contributes to the landscape of non-convex stochastic optimization by clarifying that neither pseudo-/quasi-convexity nor star-convexity is essential for (almost sure) global convergence; rather, variational coherence, a much weaker requirement, suffices. We also characterize convergence rates for the subclass of strongly variationally coherent optimization problems and present simulation results.
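As a minimal illustration of the algorithm family studied here, the sketch below implements stochastic mirror descent on the probability simplex with the entropic mirror map (the exponentiated-gradient update). The toy objective, noise level, and step-size schedule are illustrative choices of ours, not taken from the paper:

```python
import numpy as np

def smd_entropic(grad_oracle, x0, steps, step_size):
    """Stochastic mirror descent on the probability simplex with the
    entropic mirror map (exponentiated-gradient update).
    grad_oracle(x) returns a noisy estimate of the gradient at x."""
    x = np.asarray(x0, dtype=float)
    for t in range(steps):
        g = grad_oracle(x)
        # Mirror step: multiply by exp(-eta_t * g), then renormalize
        # back onto the simplex.
        x = x * np.exp(-step_size(t) * g)
        x /= x.sum()
    return x  # last iterate, as analyzed in the paper

# Toy problem (our choice): minimize the linear objective f(x) = <c, x>
# over the simplex, observing gradients corrupted by Gaussian noise.
# The minimizer is the vertex corresponding to the smallest entry of c.
rng = np.random.default_rng(0)
c = np.array([0.9, 0.5, 0.1])
oracle = lambda x: c + 0.1 * rng.standard_normal(3)
x_last = smd_entropic(oracle, np.ones(3) / 3, steps=2000,
                      step_size=lambda t: 1.0 / np.sqrt(t + 1))
```

With the vanishing step size 1/sqrt(t+1), the last iterate concentrates on the optimal vertex, in line with the last-iterate convergence result stated in the abstract.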
Document type: Conference papers
Submitted on: Tuesday, November 21, 2017


  • HAL Id: hal-01643342, version 1


Zhengyuan Zhou, Panayotis Mertikopoulos, Nicholas Bambos, Stephen Boyd, Peter W. Glynn. Stochastic mirror descent for variationally coherent optimization problems. NIPS '17: Proceedings of the 31st International Conference on Neural Information Processing Systems, Dec 2017, Long Beach, CA, United States. ⟨hal-01643342⟩


