
Some Problems in Nonconvex Stochastic Optimization

Abstract: The subject of this thesis is the analysis of several stochastic algorithms in a nonconvex setting. The aim is to prove and characterize their convergence. First, we study a smooth optimization problem, analyzing a family of adaptive algorithms with momentum that includes the widely used ADAM and Nesterov's accelerated gradient descent. Convergence and fluctuations of the iterates are established. A general avoidance-of-traps result for stochastic algorithms whose dynamics are described by a nonautonomous differential equation is presented and applied to establish the nonconvergence of the iterates to saddle points. The rest of the manuscript is devoted to the case where the function that we seek to minimize is nonsmooth. Most of our results in this part apply to functions definable in an o-minimal structure. Firstly, we analyze the constant-step version of the stochastic subgradient descent (SGD) and show that the iterates converge with high probability to the set of critical points. Secondly, we show that every critical point of a generic, definable, locally Lipschitz continuous function lies on an active manifold satisfying a Verdier and an angle condition, and is either a local minimum, an active strict saddle, or a sharply repulsive critical point. We show that, under mild conditions on the perturbation sequence, the SGD escapes active strict saddles and sharply repulsive critical points. An improvement of the projection formula for definable functions, giving a Lipschitz-like condition on their Clarke subgradients, is presented and is of independent interest. Finally, we establish an oscillation phenomenon of the iterates of the SGD and its proximal extensions.
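For intuition, the constant-step stochastic subgradient method studied in the nonsmooth part of the thesis can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the oracle, the test function f(x) = |x| (a nonsmooth function whose only Clarke-critical point is 0), the step size, and the Gaussian noise model are all illustrative assumptions.

```python
import random

def sgd_constant_step(subgrad_oracle, x0, step, n_iters, seed=0):
    """Constant-step stochastic subgradient descent (sketch).

    subgrad_oracle(x, rng) returns a noisy element of the Clarke
    subdifferential of the objective at x.
    """
    rng = random.Random(seed)
    x = x0
    for _ in range(n_iters):
        g = subgrad_oracle(x, rng)
        x = x - step * g  # constant step, as in the thesis's setting
    return x

# Illustrative oracle for f(x) = |x|: a subgradient (the sign of x)
# corrupted by a zero-mean perturbation.
def noisy_abs_subgrad(x, rng):
    g = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)  # element of the subdifferential of |x|
    return g + rng.gauss(0.0, 0.1)                  # zero-mean noise

x_final = sgd_constant_step(noisy_abs_subgrad, x0=5.0, step=0.01, n_iters=2000)
```

With a constant step, the iterates do not converge to the critical point exactly; instead they reach a neighborhood of it and then oscillate, which is the flavor of the oscillation phenomenon mentioned in the abstract.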
Submitted on: Saturday, June 18, 2022 - 6:24:20 AM
Last modification on: Friday, June 24, 2022 - 4:05:30 AM


Version validated by the jury (STAR)


  • HAL Id : tel-03698454, version 1


Sholom Schechtman. Some Problems in Nonconvex Stochastic Optimization. Numerical Analysis [math.NA]. Université Gustave Eiffel, 2021. English. ⟨NNT : 2021UEFL2031⟩. ⟨tel-03698454⟩


