Subgradient Method

The subgradient method is a fundamental algorithm for solving optimization problems, particularly those with non-smooth (non-differentiable) objective functions. At each iteration it steps from the current point along the negative of a subgradient of the objective, typically with a diminishing step size, since an ordinary gradient may not exist at points of non-differentiability. Current research focuses on extending its applicability to complex settings, including decentralized optimization over time-varying networks, problems with singularities, and non-convex or weakly convex objectives, often employing techniques such as proximal methods and Lagrangian approaches. These advances improve the efficiency and robustness of subgradient methods, with impact on diverse fields such as machine learning (e.g., network slimming and support vector machines), and they offer theoretical insights into convergence rates and complexity bounds for various problem classes.
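
As a concrete illustration, the sketch below applies the plain subgradient method with a diminishing step size to a soft-margin linear SVM, one of the applications mentioned above. The synthetic data, the 1/sqrt(k) step-size schedule, and the regularization weight `lam` are illustrative assumptions, not taken from any particular paper.

```python
import numpy as np

# Objective (regularized hinge loss, non-differentiable at the kink):
#   f(w) = (lam/2) * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>)

def svm_subgradient(w, X, y, lam):
    """Return one valid subgradient of the regularized hinge-loss objective at w."""
    margins = y * (X @ w)
    # Points with margin < 1 contribute -y_i * x_i; at the kink (margin == 1)
    # zero is an admissible element of the subdifferential.
    active = margins < 1.0
    g_hinge = -(X[active].T @ y[active]) / len(y)
    return lam * w + g_hinge

def subgradient_method(X, y, lam=0.1, iters=500):
    """Subgradient method with the classic diminishing step size a_k = 1/sqrt(k)."""
    w = np.zeros(X.shape[1])
    best_w, best_f = w.copy(), np.inf
    for k in range(1, iters + 1):
        g = svm_subgradient(w, X, y, lam)
        w = w - (1.0 / np.sqrt(k)) * g  # diminishing step size
        f = 0.5 * lam * (w @ w) + np.mean(np.maximum(0.0, 1.0 - y * (X @ w)))
        if f < best_f:                  # track the best iterate, since
            best_w, best_f = w.copy(), f  # f(w_k) is not monotone for this method
    return best_w, best_f

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))       # hypothetical synthetic features
    y = np.sign(X @ rng.normal(size=5))  # linearly separable labels
    w_hat, f_hat = subgradient_method(X, y)
    print("objective at best iterate:", round(f_hat, 4))
```

Tracking the best iterate rather than the last one reflects a standard property of the method: the objective values are not guaranteed to decrease at every step, only the best value so far converges under the usual step-size conditions.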

Papers