Paper ID: 2307.10053

Stochastic Subgradient Methods with Guaranteed Global Stability in Nonsmooth Nonconvex Optimization

Nachuan Xiao, Xiaoyin Hu, Kim-Chuan Toh

In this paper, we focus on providing convergence guarantees for stochastic subgradient methods in minimizing nonsmooth nonconvex functions. We first investigate the global stability of a general framework for stochastic subgradient methods, where the corresponding differential inclusion admits a coercive Lyapunov function. We prove that, for any sequence of sufficiently small stepsizes and approximation parameters, coupled with sufficiently controlled noise, the iterates are uniformly bounded and asymptotically stabilize around the stable set of the corresponding differential inclusion. Moreover, we develop an improved analysis that applies our framework to establish the global stability of a wide range of stochastic subgradient methods whose corresponding Lyapunov functions may be non-coercive. These theoretical results illustrate the promising potential of our framework for establishing the global stability of various stochastic subgradient methods.
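To fix ideas, the generic iteration analyzed in frameworks of this kind takes the form x_{k+1} = x_k − η_k (g_k + ξ_k), where g_k is a (possibly approximate) subgradient of the objective at x_k, η_k is a diminishing stepsize, and ξ_k is evaluation noise. The following is a minimal sketch of this update on a toy nonsmooth nonconvex objective; the objective, stepsize schedule, and noise model below are illustrative assumptions, not the paper's framework or its conditions.

```python
# Minimal sketch of a generic stochastic subgradient iteration
#   x_{k+1} = x_k - eta_k * (g_k + xi_k)
# on a toy nonsmooth nonconvex objective. The objective, stepsize
# schedule, and noise model are assumptions for illustration only;
# they do not reproduce the paper's framework or its assumptions.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Nonsmooth nonconvex toy objective: ||x||_1 - 0.5 * sum(cos(x_i)).
    return np.sum(np.abs(x)) - 0.5 * np.sum(np.cos(x))

def subgrad(x):
    # One element of the Clarke subdifferential of f at x:
    # sign(x_i) from |x_i| plus 0.5*sin(x_i) from -0.5*cos(x_i).
    return np.sign(x) + 0.5 * np.sin(x)

x = rng.normal(size=5)
for k in range(10_000):
    eta = 1.0 / (k + 100)           # diminishing stepsize
    xi = 0.1 * rng.normal(size=5)   # zero-mean stochastic noise
    x = x - eta * (subgrad(x) + xi)

print(f"f(x) after 10000 steps: {f(x):.4f}")
```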

Submitted: Jul 19, 2023