Stein Discrepancy
Stein discrepancy (SD) is a tool for measuring the discrepancy between probability distributions that depends on the target only through its score function ∇ log p, making it particularly useful for unnormalized densities, where the normalizing constant is unavailable. Current research focuses on improving the efficiency and applicability of the kernelized Stein discrepancy (KSD), a popular variant whose standard estimators scale quadratically in the sample size, through techniques such as Nyström approximation to reduce this computational cost and sequential testing for adaptive sample-size control. These advances are impacting fields including goodness-of-fit testing, Bayesian inference, and model-based reinforcement learning by enabling more efficient and accurate analysis of complex probability distributions. Furthermore, KSD is being leveraged to develop novel methods for model explanation and client selection in federated learning.
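For concreteness, the squared KSD between a sample distribution q and a target p with score s_p(x) = ∇ log p(x) can be written as KSD²(q, p) = E_{x, x′ ∼ q}[u_p(x, x′)], where the Stein kernel u_p(x, x′) = s_p(x)ᵀ k(x, x′) s_p(x′) + s_p(x)ᵀ ∇_{x′} k(x, x′) + ∇_x k(x, x′)ᵀ s_p(x′) + ∇_x · ∇_{x′} k(x, x′) involves p only through its score, so the normalizing constant never appears. The sketch below is a minimal illustration of this idea, not any particular paper's implementation: the function name `ksd_u_statistic`, the one-dimensional RBF kernel, and the fixed bandwidth are all assumptions made for brevity. It estimates KSD² with the standard U-statistic, whose O(n²) pairwise cost is what Nyström-style approximations aim to reduce.

```python
import numpy as np

def ksd_u_statistic(samples, score_fn, bandwidth=1.0):
    """U-statistic estimate of the squared kernelized Stein discrepancy
    between the sample distribution and a target p, given only the score
    s(x) = d/dx log p(x). Illustrative sketch: 1-D samples, RBF kernel
    k(x, y) = exp(-(x - y)^2 / (2 h^2)) with fixed bandwidth h."""
    x = np.asarray(samples, dtype=float)
    n = x.size
    s = score_fn(x)                             # scores s(x_i)
    d = x[:, None] - x[None, :]                 # pairwise differences x_i - x_j
    h2 = bandwidth ** 2
    k = np.exp(-d ** 2 / (2 * h2))              # kernel matrix k(x_i, x_j)
    dk_dx = -d / h2 * k                         # dk/dx evaluated at (x_i, x_j)
    dk_dy = d / h2 * k                          # dk/dy evaluated at (x_i, x_j)
    d2k = (1.0 / h2 - d ** 2 / h2 ** 2) * k     # d^2 k / dx dy
    # Stein kernel u_p(x_i, x_j): the only dependence on p is via s(.)
    u = (s[:, None] * s[None, :] * k
         + s[:, None] * dk_dy
         + s[None, :] * dk_dx
         + d2k)
    np.fill_diagonal(u, 0.0)                    # U-statistic: drop i = j terms
    return u.sum() / (n * (n - 1))

# Example: target p = N(0, 1), whose score is s(x) = -x.
rng = np.random.default_rng(0)
xs = rng.normal(size=500)
print(ksd_u_statistic(xs, lambda x: -x))        # near zero: samples match p
print(ksd_u_statistic(xs + 2.0, lambda x: -x))  # larger under a mean shift
```

On samples that match the target, the estimate is close to zero; under a mean shift it grows, which is exactly the signal that KSD-based goodness-of-fit tests exploit.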