Paper ID: 2303.15740

Concentration of Contractive Stochastic Approximation: Additive and Multiplicative Noise

Zaiwei Chen, Siva Theja Maguluri, Martin Zubeldia

In this paper, we establish maximal concentration bounds for the iterates generated by a stochastic approximation (SA) algorithm under a contractive operator with respect to an arbitrary norm (for example, the $\ell_\infty$-norm). We consider two settings where the iterates are potentially unbounded: SA with bounded multiplicative noise and SA with sub-Gaussian additive noise. Our maximal concentration inequalities state that the convergence error has a sub-Gaussian tail in the additive noise setting and a Weibull tail (which is faster than polynomial decay but could be slower than exponential decay) in the multiplicative noise setting. In addition, we provide an impossibility result showing that, in general, sub-exponential tails cannot be achieved under multiplicative noise. To establish these maximal concentration bounds, we develop a novel bootstrapping argument that involves bounding the moment-generating function of a modified version of the generalized Moreau envelope of the convergence error and constructing an exponential supermartingale to enable the use of Ville's maximal inequality. We demonstrate the applicability of our theoretical results in the context of linear SA and reinforcement learning.
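
To make the setting concrete, the following minimal sketch (not code from the paper) simulates the standard contractive SA recursion $x_{k+1} = x_k + \alpha_k (H(x_k) - x_k + w_k)$, where $H$ is a $\gamma$-contraction with respect to the $\ell_\infty$-norm and $w_k$ is either sub-Gaussian additive noise or bounded multiplicative noise. The contraction map `H`, the contraction factor `gamma`, and the step-size schedule are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 0.9           # assumed contraction factor of H w.r.t. the l_inf-norm
d = 5                 # dimension of the iterate
x_star = np.zeros(d)  # fixed point of H, i.e., H(x_star) = x_star

def H(x):
    # A simple gamma-contraction with fixed point x_star (illustrative choice).
    return x_star + gamma * (x - x_star)

def run_sa(num_iters=10_000, multiplicative=False):
    x = np.ones(d)
    for k in range(num_iters):
        alpha = 1.0 / (k + 10)  # diminishing step size (assumed schedule)
        if multiplicative:
            # Bounded multiplicative noise: magnitude scales with the iterate.
            w = rng.uniform(-0.5, 0.5, d) * np.linalg.norm(x, np.inf)
        else:
            # Sub-Gaussian additive noise (here standard Gaussian).
            w = rng.normal(0.0, 1.0, d)
        x = x + alpha * (H(x) - x + w)
    return np.linalg.norm(x - x_star, np.inf)  # convergence error

errors = [run_sa() for _ in range(100)]
print("median l_inf error (additive noise):", np.median(errors))
```

Running this with `multiplicative=True` versus `False` gives an empirical feel for the two noise regimes whose tails (Weibull versus sub-Gaussian, respectively) the paper characterizes; the simulation itself is only a sketch of the setting, not of the proof technique.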

Submitted: Mar 28, 2023