Paper ID: 2404.09617

Safeguarding adaptive methods: global convergence of Barzilai-Borwein and other stepsize choices

Hongjia Ou, Andreas Themelis

Leveraging recent advances in adaptive methods for convex minimization problems, this paper provides a linesearch-free proximal gradient framework for globalizing the convergence of popular stepsize choices such as Barzilai-Borwein and one-dimensional Anderson acceleration. The framework can cope with problems in which the gradient of the differentiable function is merely locally Hölder continuous. Our analysis not only encompasses but also refines the existing results on which it builds. The theory is corroborated by numerical evidence showcasing the synergetic interplay between fast stepsize selections and adaptive methods.
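
For readers unfamiliar with the combination the abstract describes, the following is a minimal Python sketch of a proximal gradient iteration driven by the Barzilai-Borwein (BB1) stepsize. It is illustrative only and not the paper's method: the names (prox_grad_bb, grad_f, prox_g, gamma_max) are assumptions, and the safeguard here is a plain clip of the stepsize rather than the paper's linesearch-free globalization framework.

import numpy as np

def prox_grad_bb(grad_f, prox_g, x0, gamma0=1.0, gamma_max=1e6,
                 tol=1e-8, max_iter=500):
    # Proximal gradient with a Barzilai-Borwein (BB1) stepsize.
    # Illustrative sketch: an invalid or huge BB stepsize is simply
    # clipped, which is NOT the safeguarding mechanism of the paper.
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    gamma = gamma0
    for _ in range(max_iter):
        x_new = prox_g(x - gamma * g, gamma)   # forward-backward step
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g            # displacement / gradient change
        if np.linalg.norm(s) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        sy = s @ y
        # BB1 stepsize <s,s>/<s,y>; keep the previous gamma when invalid
        gamma = min(s @ s / sy, gamma_max) if sy > 0 else gamma
        x, g = x_new, g_new
    return x

# Example use on a lasso problem: min 0.5*||Ax - b||^2 + lam*||x||_1,
# whose proximal map is the standard soft-thresholding operator.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
x_star = prox_grad_bb(grad_f, prox_g, np.zeros(100))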

Submitted: Apr 15, 2024