Composite Optimization
Composite optimization addresses problems whose objective is the sum of a smooth term and a non-smooth term, typically written as minimizing f(x) + g(x), where f is smooth (for example, a data-fitting loss) and g is non-smooth (for example, an l1 penalty or the indicator function of a constraint set). Current research emphasizes efficient first-order methods, including proximal gradient methods and primal-dual algorithms such as ADMM and its variants, to handle problem structures like multi-block objectives, non-convexity, and stochasticity, often within federated learning settings. These advances improve the scalability and convergence rates of algorithms for diverse applications, ranging from training machine learning models (including deep learning and robust models) to statistical recovery problems. The resulting gains in efficiency and robustness have significant implications for large-scale data analysis and optimization tasks.
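To make the proximal gradient idea concrete, here is a minimal sketch applied to an assumed lasso-type instance, min_x 0.5||Ax - b||^2 + lam*||x||_1. The gradient step acts on the smooth least-squares term, and the soft-thresholding step is the proximal operator of the l1 term. The function names, fixed step size 1/L, and synthetic data are illustrative assumptions, not taken from any particular paper summarized above.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_gradient(A, b, lam, n_iters=500):
    # ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant
    # of the smooth term's gradient.
    L = np.linalg.norm(A, ord=2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                    # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)   # prox step on the non-smooth term
    return x

# Hypothetical usage on synthetic data with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = proximal_gradient(A, b, lam=0.1)
```

The split into a gradient step and a prox step is what makes the non-smooth term tractable: the l1 prox has a closed form, so each iteration stays cheap even though the overall objective is non-differentiable.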
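For the primal-dual family, the sketch below shows ADMM on the same assumed lasso instance, using the standard splitting x = z so that the smooth and non-smooth terms are handled in separate subproblems. The penalty parameter rho, iteration count, and Cholesky-based x-update are illustrative choices; this is a generic textbook variant, not the specific multi-block or federated algorithms the summary refers to.

```python
def admm_lasso(A, b, lam, rho=1.0, n_iters=200):
    # ADMM for min_{x,z} 0.5*||Ax - b||^2 + lam*||z||_1  s.t. x = z,
    # with u the scaled dual variable.
    n = A.shape[1]
    Atb = A.T @ b
    # Factor the x-update system once; it is fixed across iterations.
    chol = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)
    for _ in range(n_iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(chol.T, np.linalg.solve(chol, rhs))  # x-update (linear solve)
        z = soft_threshold(x + u, lam / rho)                     # z-update (l1 prox)
        u = u + x - z                                            # scaled dual update
    return z
```

Each subproblem here has a closed-form or cheap solution, which is the structural property that makes ADMM and its variants attractive for the multi-block and distributed settings mentioned above.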