Beta Divergence
Beta divergence is a parametric family of statistical measures quantifying the discrepancy between probability distributions (or, more generally, non-negative data), with well-known special cases including the Kullback-Leibler and Itakura-Saito divergences and the squared Euclidean distance. Current research focuses on using beta divergences as training objectives in machine learning models such as generative adversarial networks (GANs), flow-based models, and restricted Boltzmann machines (RBMs) to improve robustness, particularly for heavy-tailed distributions or noisy data. This work addresses challenges in density estimation, unsupervised domain adaptation, and robust statistical inference, with the broader aim of more accurate and reliable machine learning algorithms.
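For concreteness, the sketch below computes the beta divergence between two non-negative arrays under the common parameterization in which beta = 0 recovers the Itakura-Saito divergence, beta = 1 the generalized Kullback-Leibler divergence, and beta = 2 half the squared Euclidean distance. The function and variable names are illustrative assumptions, not taken from any particular paper or library.

```python
import numpy as np

def beta_divergence(x, y, beta):
    """Beta divergence between non-negative arrays x and y, summed over entries.

    Special cases: beta=0 (Itakura-Saito), beta=1 (generalized Kullback-Leibler),
    beta=2 (half squared Euclidean distance).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if beta == 0:
        # Itakura-Saito divergence
        return np.sum(x / y - np.log(x / y) - 1)
    if beta == 1:
        # Generalized Kullback-Leibler divergence
        return np.sum(x * np.log(x / y) - x + y)
    # General case for beta not in {0, 1}
    return np.sum(
        (x**beta + (beta - 1) * y**beta - beta * x * y**(beta - 1))
        / (beta * (beta - 1))
    )

# Example: compare two (unnormalized) non-negative vectors at several beta values
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])
for b in (0.0, 1.0, 2.0):
    print(f"beta={b}: {beta_divergence(p, q, b):.4f}")
```

Tuning beta trades off sensitivity to large versus small entries, which is one reason the family is attractive for robust estimation with noisy or heavy-tailed data.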