Bregman Proximal Methods

Bregman proximal methods are optimization algorithms that replace the squared Euclidean distance in classical proximal steps with a more general Bregman divergence, letting the geometry of the update match the structure of the problem. Current research focuses on improving the efficiency and convergence guarantees of these methods, particularly within gradient boosting machines, proximal gradient methods, and alternating minimization schemes for applications such as optimal transport and image processing. This work addresses limitations of existing approaches, such as spurious stationary points and high computational cost, yielding more robust and efficient solutions for problems in machine learning, computer vision, and other areas that require complex optimization.
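As a minimal illustration of the core idea (a generic sketch, not the method of any particular paper listed below): choosing the negative-entropy mirror map \(h(x) = \sum_i x_i \log x_i\) turns the Bregman proximal gradient step over the probability simplex into the multiplicative, exponentiated-gradient update. The function names and the quadratic objective here are illustrative assumptions.

```python
import numpy as np

def bregman_prox_step(x, grad, step):
    """One Bregman proximal gradient step with the negative-entropy
    divergence: the minimizer of <grad, y> + (1/step) * D_h(y, x)
    over the simplex, which has the closed-form multiplicative update
    x * exp(-step * grad), renormalized to sum to one."""
    y = x * np.exp(-step * grad)
    return y / y.sum()

# Illustrative use: minimize f(x) = 0.5 * ||x - c||^2 over the simplex,
# where c already lies on the simplex, so the minimizer is x = c.
c = np.array([0.2, 0.5, 0.3])
x = np.full(3, 1.0 / 3.0)   # uniform starting point
for _ in range(200):
    x = bregman_prox_step(x, grad=x - c, step=0.5)
```

Because the update is multiplicative, iterates stay strictly positive and on the simplex without any explicit projection, which is the practical appeal of matching the divergence to the constraint geometry.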

Papers