Exponential Convergence
Exponential convergence is a key property sought in computational methods: the error shrinks by a constant factor at every iteration, so accurate solutions are reached quickly even in challenging optimization and machine learning problems. Current research investigates this behavior across diverse algorithms, including stochastic gradient descent (SGD) and its momentum variants, randomized Kaczmarz methods, and deep operator networks for solving partial differential equations. Establishing and improving exponential convergence rates is crucial for the efficiency and scalability of these methods in applications ranging from hyperparameter optimization to decentralized consensus and robust learning.
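As a concrete illustration of the kind of result studied here, the sketch below (an assumption of ours, not taken from any paper on this page) runs the randomized Kaczmarz method on a consistent linear system; under row sampling proportional to squared row norms, the expected squared error contracts by a fixed factor per step, i.e. it converges exponentially.

```python
import numpy as np

# Illustrative sketch: randomized Kaczmarz on a consistent system Ax = b.
# Each step projects the iterate onto the hyperplane of one randomly
# chosen equation; the error decays exponentially in expectation.
rng = np.random.default_rng(0)

m, n = 200, 20
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true  # consistent right-hand side

row_norms_sq = np.sum(A**2, axis=1)
probs = row_norms_sq / row_norms_sq.sum()  # sample row i with prob ∝ ||a_i||^2

x = np.zeros(n)
errors = []
for _ in range(3000):
    i = rng.choice(m, p=probs)
    a_i = A[i]
    # Orthogonal projection onto the hyperplane a_i · x = b_i
    x = x + (b[i] - a_i @ x) / row_norms_sq[i] * a_i
    errors.append(np.linalg.norm(x - x_true))

print(f"initial error: {errors[0]:.3e}, final error: {errors[-1]:.3e}")
```

Plotting `errors` on a log scale would show a roughly straight line, the signature of exponential (geometric) convergence.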
Papers
November 6, 2024
November 4, 2024
May 24, 2024
December 22, 2023
February 7, 2023
April 18, 2022
February 24, 2022
February 22, 2022