Bregman Divergence

Bregman divergence measures the discrepancy between two points using a strictly convex function, and finds applications across machine learning and statistics. Current research focuses on leveraging Bregman divergences within optimization algorithms such as mirror descent and bilevel optimization, particularly for handling non-convex problems and improving convergence rates. This generalized distance measure (not a true metric, since it is generally asymmetric and violates the triangle inequality) is proving valuable in diverse areas, including density ratio estimation, clustering, and federated learning, by providing a flexible framework for handling various loss functions and data geometries. The resulting algorithms often demonstrate improved performance and stronger theoretical guarantees compared to traditional methods.
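As a minimal sketch of the definition: for a differentiable strictly convex function F, the Bregman divergence is D_F(p, q) = F(p) - F(q) - ⟨∇F(q), p - q⟩. The function names below (`bregman_divergence` and the two example generators) are illustrative, not from any particular library; choosing F as the squared Euclidean norm recovers squared Euclidean distance, and choosing F as negative entropy recovers the KL divergence on probability vectors.

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# F(x) = ||x||^2 yields the squared Euclidean distance ||p - q||^2.
sq_norm = lambda x: np.dot(x, x)
sq_norm_grad = lambda x: 2 * x

# F(x) = sum_i x_i log x_i (negative entropy) yields KL(p || q)
# when p and q are probability vectors (entries sum to 1).
neg_entropy = lambda x: np.sum(x * np.log(x))
neg_entropy_grad = lambda x: np.log(x) + 1

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])

d_euclid = bregman_divergence(sq_norm, sq_norm_grad, p, q)
d_kl = bregman_divergence(neg_entropy, neg_entropy_grad, p, q)
```

Different choices of F thus recover familiar loss functions as special cases, which is what makes the framework flexible enough to adapt algorithms like mirror descent to the geometry of the data.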

Papers