Kernel Stein Discrepancy

Kernel Stein Discrepancy (KSD) is a statistical measure of how far a set of samples, or an approximating distribution, lies from a target distribution. It is especially useful when the target density is complex or known only up to a normalizing constant, because KSD depends on the target only through its score function ∇ log p, which the normalizing constant does not affect. Current research focuses on improving the efficiency of KSD-based methods, particularly for high-dimensional data, through techniques such as Nyström approximation and aggregated kernel tests, as well as on developing robust KSD tests that are less sensitive to model misspecification. These advances matter for goodness-of-fit testing, evaluating Bayesian neural networks, and improving the convergence analysis of algorithms such as Stein Variational Gradient Descent (SVGD), with impact across fields that rely on accurate probabilistic modeling.
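To make the idea concrete, below is a minimal sketch of the standard V-statistic estimator of the squared KSD using the Langevin Stein kernel with an RBF base kernel. The function name `ksd_vstat` and the fixed bandwidth are illustrative assumptions, not taken from any particular paper or library; note that only the score function of the target is required, never its normalizing constant.

```python
import numpy as np

def ksd_vstat(samples, score_fn, bandwidth=1.0):
    """Squared-KSD V-statistic with an RBF kernel (illustrative sketch).

    samples:  (n, d) array of points from the approximating distribution q
    score_fn: callable returning grad log p(x) for the (unnormalized) target,
              applied row-wise to an (n, d) array
    """
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    s = score_fn(x)                        # (n, d) score vectors grad log p
    h2 = bandwidth ** 2

    diff = x[:, None, :] - x[None, :, :]   # (n, n, d) pairwise differences
    sqd = np.sum(diff ** 2, axis=-1)       # (n, n) squared distances
    k = np.exp(-sqd / (2.0 * h2))          # RBF kernel matrix

    # Gradients of k with respect to its first and second arguments
    gkx = -diff / h2 * k[..., None]        # (n, n, d): dk/dx_i
    gky = diff / h2 * k[..., None]         # (n, n, d): dk/dx_j

    # The four terms of the Langevin Stein kernel u_p(x_i, x_j)
    term1 = (s @ s.T) * k                       # s_i . s_j * k
    term2 = np.einsum('id,ijd->ij', s, gky)     # s_i . grad_y k
    term3 = np.einsum('jd,ijd->ij', s, gkx)     # s_j . grad_x k
    term4 = (d / h2 - sqd / h2 ** 2) * k        # trace of grad_x grad_y k

    return np.sum(term1 + term2 + term3 + term4) / n ** 2
```

As a quick check, samples drawn from the target itself should give a value near zero, while samples from a shifted distribution should not; for a standard normal target the score is simply -x:

```python
rng = np.random.default_rng(0)
print(ksd_vstat(rng.normal(size=(500, 2)), lambda x: -x))            # near 0
print(ksd_vstat(rng.normal(2.0, 1.0, (500, 2)), lambda x: -x))       # larger
```

This naive estimator costs O(n²d) in time and memory, which is exactly the bottleneck that Nyström-style approximations mentioned above aim to reduce.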

Papers