Exponential Convergence

Exponential convergence, in which the error shrinks geometrically with the number of iterations, is a key research goal across computational methods, particularly for challenging optimization and machine learning problems. Current research investigates this behavior in diverse algorithms, including stochastic gradient descent (SGD) and its momentum variants, randomized Kaczmarz methods, and deep operator networks for solving partial differential equations. Demonstrating and improving exponential convergence rates is crucial for the efficiency and scalability of these methods, in applications ranging from hyperparameter optimization to decentralized consensus and robust learning.
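
Of the methods named above, the randomized Kaczmarz method admits a particularly concrete illustration of exponential convergence: sampling rows of a consistent linear system with probability proportional to their squared norms yields convergence in expectation at an exponential (linear) rate (Strohmer and Vershynin, 2009). Below is a minimal sketch in Python/NumPy under the assumption of a consistent overdetermined system; the function name, iteration count, and test setup are illustrative choices, not taken from any specific paper.

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=2000, seed=0):
    """Randomized Kaczmarz for a consistent linear system Ax = b.

    Rows are sampled with probability proportional to ||a_i||^2,
    which gives an exponential convergence rate in expectation.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum("ij,ij->i", A, A)   # squared row norms
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)               # sample row i ~ ||a_i||^2
        a_i = A[i]
        # project x onto the hyperplane {z : <a_i, z> = b_i}
        x += (b[i] - a_i @ x) / row_norms_sq[i] * a_i
    return x

# Quick check on a random consistent system (illustrative only):
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
x_true = rng.standard_normal(20)
b = A @ x_true
x_hat = randomized_kaczmarz(A, b)
print(np.linalg.norm(x_hat - x_true))  # error decays geometrically with iters
```

Tracking the residual across iterations and plotting it on a log scale makes the exponential rate visible as a straight line, which is the standard empirical check used in this literature.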

Papers