Accelerated Proximal Methods

Accelerated proximal methods are optimization algorithms for composite problems whose objectives combine a smooth loss with nonsmooth regularization terms or constraints; acceleration, typically through momentum or extrapolation, yields faster convergence than plain proximal gradient steps. Current research focuses on improving convergence rates, particularly for non-convex and minimax problems, through techniques such as damped step sizes, inertial proximal updates, and relaxed error criteria within frameworks like APPA (Catalyst) and variants of alternating gradient descent-ascent. These advances affect diverse fields, including machine learning (e.g., training neural ODEs and adversarial models) and signal processing (e.g., energy disaggregation), by enabling faster and more robust solutions to challenging optimization tasks. Developing more efficient proximal methods remains an active area of research, driving improvements in the scalability and performance of many applications.
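
To make the basic idea concrete, the sketch below shows a FISTA-style accelerated proximal gradient iteration on a lasso-type objective 0.5||Ax - b||^2 + lam*||x||_1: a gradient step on the smooth part, a proximal (soft-thresholding) step on the nonsmooth part, and a Nesterov momentum extrapolation. This is a minimal illustration of the general technique, not an implementation of any specific method cited above; the problem data, step size, and iteration count are assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_proximal_gradient(A, b, lam, n_iters=200):
    """FISTA-style iteration for 0.5*||Ax - b||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, with L the gradient Lipschitz constant
    x = np.zeros(A.shape[1])
    y = x.copy()                                # extrapolated point
    t = 1.0                                     # momentum parameter
    for _ in range(n_iters):
        grad = A.T @ (A @ y - b)                # gradient of the smooth term at y
        x_next = soft_threshold(y - step * grad, step * lam)   # proximal step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0     # momentum schedule
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)       # Nesterov extrapolation
        x, t = x_next, t_next
    return x

# Illustrative usage on random data (assumed, for demonstration only):
# A = np.random.randn(100, 50); b = np.random.randn(100)
# x_hat = accelerated_proximal_gradient(A, b, lam=0.1)
```

The momentum schedule for t gives the classical O(1/k^2) rate on convex composite problems; the non-convex and minimax variants mentioned above modify this template with damped steps, inertial proximal updates, or inexact inner solves.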

Papers