Accelerated Algorithm

Accelerated algorithms aim to improve the efficiency of optimization methods used in diverse applications, from machine learning to large-scale distributed systems. Current research focuses on developing and analyzing accelerated algorithms for various problem structures, including stochastic bilevel optimization, minimax optimization, and problems involving non-convex and non-smooth objectives. Techniques such as Nesterov acceleration and Anderson acceleration, along with newer approaches like element-wise RSAV, are being explored and refined. These advances are crucial for tackling increasingly complex optimization challenges in areas such as graph neural networks and sparse optimization, enabling faster training and better scalability in real-world applications.
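As a concrete illustration of the acceleration idea, here is a minimal sketch of Nesterov accelerated gradient descent on a smooth convex function. The function names, step size, and momentum value are illustrative choices, not taken from any specific paper above; the key feature of Nesterov's scheme is that the gradient is evaluated at a "look-ahead" point after the momentum step, rather than at the current iterate.

```python
def nesterov_gd(grad, x0, lr, momentum=0.9, steps=100):
    """Sketch of Nesterov accelerated gradient descent.

    grad: function returning the gradient (as a list) at a point.
    x0:   starting point (list of floats).
    """
    x = list(x0)
    v = [0.0] * len(x0)
    for _ in range(steps):
        # Look-ahead point: take the momentum step first, then
        # evaluate the gradient there (this is what distinguishes
        # Nesterov's method from classical heavy-ball momentum).
        lookahead = [xi + momentum * vi for xi, vi in zip(x, v)]
        g = grad(lookahead)
        v = [momentum * vi - lr * gi for vi, gi in zip(v, g)]
        x = [xi + vi for xi, vi in zip(x, v)]
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
xmin = nesterov_gd(lambda x: [2.0 * (x[0] - 3.0)], [0.0], lr=0.1)
```

For smooth convex problems, this look-ahead correction improves the worst-case convergence rate from O(1/k) for plain gradient descent to O(1/k²).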

Papers