Stein Variational Gradient Descent
Stein Variational Gradient Descent (SVGD) is a particle-based algorithm for approximating probability distributions: it iteratively transports a set of particles toward a target density using deterministic updates that combine a kernel-weighted gradient of the log density with a repulsive term between particles, allowing it to sample from complex, high-dimensional targets without relying on Markov chain Monte Carlo. Current research focuses on improving SVGD's convergence speed and accuracy, particularly in the finite-particle regime, through techniques such as deep unfolding, noise injection, and the incorporation of importance weights or constraints. These advances are influencing fields including Bayesian inference, reinforcement learning, and robotics by enabling more efficient and robust solutions to challenging inference and optimization problems.
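To make the update concrete, below is a minimal NumPy sketch of the standard SVGD step, assuming an RBF kernel with a median-heuristic bandwidth and a target whose score function (gradient of the log density) is available in closed form. The function names (`rbf_kernel`, `svgd_step`) and the toy Gaussian target are illustrative choices, not taken from any particular paper's implementation.

```python
import numpy as np

def rbf_kernel(x, h):
    """RBF kernel matrix and its gradient with respect to the first argument.

    x : (n, d) array of particles
    h : bandwidth
    Returns K with K[j, i] = k(x_j, x_i) and grad_K with
    grad_K[j, i] = d k(x_j, x_i) / d x_j.
    """
    diffs = x[:, None, :] - x[None, :, :]           # (n, n, d), diffs[j, i] = x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)          # (n, n) squared distances
    K = np.exp(-sq_dists / h)                       # kernel values
    grad_K = -2.0 / h * diffs * K[..., None]        # gradient of k(x_j, x_i) w.r.t. x_j
    return K, grad_K

def svgd_step(x, score_fn, step_size=0.1):
    """One SVGD update: x_i <- x_i + eps * phi(x_i), where
    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]."""
    n = x.shape[0]
    # Median heuristic for the bandwidth (a common default choice).
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    h = np.median(sq_dists) / np.log(n + 1) + 1e-8
    K, grad_K = rbf_kernel(x, h)
    scores = score_fn(x)                            # (n, d): grad log p at each particle
    # First term pulls particles toward high-density regions; second term repels them.
    phi = (K.T @ scores + grad_K.sum(axis=0)) / n
    return x + step_size * phi

# Toy usage (assumed example): particles initialized far from a standard 2-D Gaussian.
rng = np.random.default_rng(0)
particles = rng.normal(loc=5.0, scale=1.0, size=(100, 2))
score = lambda x: -x                                # grad log N(0, I)
for _ in range(500):
    particles = svgd_step(particles, score)
print(particles.mean(axis=0), particles.std(axis=0))  # roughly [0, 0] and [1, 1]
```

The repulsive kernel-gradient term is what keeps the particles spread out so that they approximate the full target distribution rather than collapsing onto its mode; the research directions mentioned above (deep unfolding, noise injection, importance weighting) modify this basic update to improve its finite-particle behavior.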