Rényi Divergence

Rényi divergence is a family of measures of the difference between two probability distributions, parameterized by an order α, and it recovers other divergences, such as the Kullback-Leibler divergence, as special or limiting cases. Current research focuses on leveraging Rényi divergence in a range of applications, including improving the efficiency of sampling algorithms for high-dimensional data, enhancing the robustness of machine learning models (e.g., through semi-supervised learning and contrastive learning), and providing stronger privacy guarantees for differentially private algorithms. This versatility makes Rényi divergence a valuable tool for addressing challenges in diverse fields, from theoretical computer science and statistics to machine learning and data privacy.
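For concreteness, the order-α Rényi divergence between discrete distributions P and Q is D_α(P‖Q) = (1/(α−1)) log Σᵢ pᵢ^α qᵢ^(1−α) for α > 0, α ≠ 1, and the limit α → 1 recovers the Kullback-Leibler divergence. The sketch below is a minimal, illustrative implementation of this standard definition, assuming NumPy and SciPy are available; the function name `renyi_divergence` and the example distributions are hypothetical and not taken from any particular paper listed here.

```python
# Minimal sketch of the order-alpha Rényi divergence between two discrete
# distributions; distributions and names below are illustrative only.
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes the KL divergence

def renyi_divergence(p, q, alpha):
    """D_alpha(P || Q) = 1/(alpha - 1) * log(sum_i p_i^alpha * q_i^(1 - alpha))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        return entropy(p, q)  # the alpha -> 1 limit is the KL divergence
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

for alpha in (0.5, 0.999, 2.0):
    print(f"alpha={alpha}: {renyi_divergence(p, q, alpha):.6f}")
print(f"KL(P||Q):   {entropy(p, q):.6f}")  # close to the alpha=0.999 value
```

Running the snippet shows the divergence increasing with α and the value at α ≈ 1 matching the KL divergence, which is the sense in which KL arises as a special case.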

Papers