f-Divergence

f-divergences are a family of statistical measures that quantify the difference between two probability distributions, with broad application in machine learning and information theory. Current research focuses on leveraging f-divergences for density ratio estimation, on developing algorithms such as Wasserstein gradient flows and (de)-regularized Maximum Mean Discrepancy flows for improved convergence and tractability, and on applications such as out-of-distribution generalization and reinforcement learning. These advances provide sharper tools for analyzing and manipulating probability distributions, improving both theoretical understanding and practical performance across machine learning tasks, including generative modeling and robust optimization.
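Concretely, every f-divergence is generated by a convex function f with f(1) = 0 via D_f(P‖Q) = E_{x∼Q}[f(p(x)/q(x))], where p/q is the density ratio. The sketch below evaluates this definition for discrete distributions; the function name `f_divergence` and the generator lambdas are illustrative choices for exposition, not tied to any particular library.

```python
import numpy as np

def f_divergence(p, q, f):
    """Evaluate D_f(P || Q) = E_Q[f(p(x)/q(x))] for discrete distributions.

    p, q: probability vectors over the same support.
    f: convex generator with f(1) = 0.
    Assumes q > 0 wherever p > 0 (otherwise the divergence may be infinite).
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    # Density ratio p/q; zero where q is zero (valid under the assumption above).
    ratio = np.divide(p, q, out=np.zeros_like(p), where=q > 0)
    return float(np.sum(q * f(ratio)))

# Common generators: f(t) = t*log(t) recovers Kullback-Leibler divergence;
# f(t) = |t - 1| / 2 recovers total variation distance.
kl = lambda t: np.where(t > 0, t * np.log(np.maximum(t, 1e-12)), 0.0)
tv = lambda t: 0.5 * np.abs(t - 1.0)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(f_divergence(p, q, kl))  # KL(P || Q)
print(f_divergence(p, q, tv))  # total variation distance
```

Swapping in other generators yields other members of the family, e.g. f(t) = (√t − 1)² for the squared Hellinger distance; methods like density ratio estimation work by approximating the ratio p/q that this definition depends on.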

Papers