Proximal Subgradient

Proximal subgradient methods are iterative algorithms for solving composite optimization problems, particularly those involving non-smooth or non-Lipschitz objective functions, which arise frequently in practice. Current research focuses on improving convergence rates, especially for strongly convex objectives, through stochastic variants and new analyses of existing algorithms such as ISTA and FISTA. These advances aim to make proximal subgradient methods faster and more robust across fields such as machine learning, signal processing, and image reconstruction. A key line of investigation is establishing convergence guarantees that hold beyond the traditional Lipschitz continuity assumptions.
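To make the idea concrete, the following is a minimal sketch of ISTA (one of the algorithms mentioned above) applied to the lasso problem min_x 0.5·||Ax − b||² + λ·||x||₁: each iteration takes a gradient step on the smooth term, then applies the proximal operator of the ℓ₁ term (soft-thresholding). The specific matrix, vector, step-size rule, and iteration count here are illustrative choices, not taken from any particular paper.

```python
# ISTA sketch for the lasso: min_x 0.5*||A x - b||^2 + lam*||x||_1
# (pure-Python illustration; A, b, lam below are made-up example data)

def soft_threshold(v, t):
    # Proximal operator of t*|.|: shrink v toward zero by t.
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def ista(A, b, lam, n_iters=500):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    # Conservative step size 1/||A||_F^2 <= 1/L, where L is the
    # Lipschitz constant of the gradient of the smooth term.
    L = sum(a_ij * a_ij for row in A for a_ij in row)
    step = 1.0 / L
    for _ in range(n_iters):
        # Residual r = A x - b
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        # Gradient of the smooth term: A^T r
        grad = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # Gradient step, then the proximal (soft-thresholding) step
        x = [soft_threshold(x[j] - step * grad[j], step * lam)
             for j in range(n)]
    return x

# Tiny sanity check: with A = I the lasso solution is soft_threshold(b_j, lam).
A = [[1.0, 0.0], [0.0, 1.0]]
b = [3.0, 0.2]
x = ista(A, b, lam=0.5)
print([round(v, 3) for v in x])  # → [2.5, 0.0]
```

FISTA differs only in adding a momentum (extrapolation) step between iterations, which improves the worst-case convergence rate of the smooth part from O(1/k) to O(1/k²).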

Papers