Divergence Estimation

Divergence estimation quantifies the difference between probability distributions, a task that arises across many fields. Current research focuses on robust and efficient estimators, particularly for high-dimensional data and complex distributions: common targets are the Kullback-Leibler and Jensen-Shannon divergences, with techniques such as Gaussian smoothing and adaptive weighting schemes used to improve accuracy and computational efficiency. These advances matter for applications including synthetic data validation, machine learning model training and evaluation, and signal processing in challenging environments.
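
As an illustration, the sketch below estimates both divergences from finite samples: a k-nearest-neighbor estimator for the Kullback-Leibler divergence and a Gaussian-KDE-smoothed estimator for the Jensen-Shannon divergence in one dimension. This is a minimal sketch under stated assumptions, not any particular paper's method; the function names (`knn_kl_divergence`, `smoothed_js_divergence`) and parameter choices are illustrative, and it relies only on NumPy and SciPy.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.stats import entropy, gaussian_kde


def knn_kl_divergence(x, y, k=1):
    """k-nearest-neighbor estimate of D_KL(P || Q) from samples x ~ P, y ~ Q.

    Both inputs are (n_samples, n_dims) arrays. Assumes continuous
    distributions with no duplicate points (a zero distance breaks the log).
    """
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    # Distance from each x_i to its k-th nearest neighbor among the other x's
    # (query with k+1 because the closest point to x_i within x is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # Distance from each x_i to its k-th nearest neighbor among the y's.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))


def smoothed_js_divergence(x, y, grid_size=512, bandwidth=None):
    """Jensen-Shannon divergence (in nats) between two 1-D samples,
    estimated from Gaussian-KDE-smoothed densities on a shared grid."""
    x, y = np.ravel(x), np.ravel(y)
    lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
    pad = 0.1 * (hi - lo)
    grid = np.linspace(lo - pad, hi + pad, grid_size)
    p = gaussian_kde(x, bw_method=bandwidth)(grid)
    q = gaussian_kde(y, bw_method=bandwidth)(grid)
    p, q = p / p.sum(), q / q.sum()  # normalize to discrete distributions on the grid
    mix = 0.5 * (p + q)
    return 0.5 * entropy(p, mix) + 0.5 * entropy(q, mix)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=(5000, 1))  # samples from N(0, 1)
    y = rng.normal(0.5, 1.0, size=(5000, 1))  # samples from N(0.5, 1)
    # True KL(N(0,1) || N(0.5,1)) = 0.5 * 0.5**2 = 0.125 nats.
    print("kNN KL estimate:", knn_kl_divergence(x, y))
    print("Smoothed JS estimate:", smoothed_js_divergence(x, y))
```

The KDE step plays the role of the Gaussian smoothing mentioned above: it regularizes the empirical distributions so the plug-in divergence stays finite even where the two sample sets barely overlap.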

Papers