Beta Divergence

Beta divergence is a parametric family of statistical measures that quantify the discrepancy between probability distributions (and, more generally, non-negative data), with well-known special cases such as the Kullback-Leibler and Itakura-Saito divergences. Current research focuses on leveraging beta divergences within various machine learning models, including generative adversarial networks (GANs), flow-based models, and restricted Boltzmann machines (RBMs), to improve robustness, particularly when dealing with heavy-tailed distributions or noisy data. This work addresses challenges in areas such as density estimation, unsupervised domain adaptation, and robust statistical inference, with the ultimate aim of producing more accurate and reliable machine learning algorithms.
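
For reference, a sketch of one common parameterization (the Cichocki-Amari convention; other scalings appear in the literature) defines the beta divergence between non-negative values x and y as

d_\beta(x, y) = \frac{1}{\beta(\beta - 1)} \left( x^{\beta} + (\beta - 1)\, y^{\beta} - \beta\, x\, y^{\beta - 1} \right), \qquad \beta \neq 0, 1,

which recovers the Kullback-Leibler divergence in the limit \beta \to 1 and the Itakura-Saito divergence as \beta \to 0. Tuning \beta trades statistical efficiency against robustness to outliers and heavy tails, which is the property the models above exploit.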

Papers