Score-Based Generative Models
Score-based generative models (SBMs) are a class of deep generative models that learn a data distribution by modeling its score function—the gradient of the log-probability density—of a perturbed data distribution. Current research focuses on improving SBM efficiency and robustness through novel training objectives, sampling algorithms such as Langevin dynamics and probability flow ODEs, and architectural innovations such as incorporating symmetries and handling high-cardinality data. SBMs are proving valuable across diverse fields, from image generation and speech enhancement to solving inverse problems in medical imaging and astrophysics, offering a powerful tool for both data generation and inference tasks.
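To make the core idea concrete, here is a minimal sketch of sampling with Langevin dynamics when the score function is available. This is a toy illustration, not any paper's method: the target is a standard Gaussian, whose score is known analytically as grad log p(x) = -x; in a real SBM, a trained neural network would supply the score.

```python
import numpy as np

def score(x):
    # Analytic score of a standard Gaussian target: grad log p(x) = -x.
    # In practice this would be a learned score network s_theta(x).
    return -x

def langevin_sample(n_samples=1000, n_steps=500, step_size=0.1, seed=0):
    """Unadjusted Langevin dynamics: x <- x + (eps/2) * score(x) + sqrt(eps) * z."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_samples) * 5.0  # initialize far from the target
    for _ in range(n_steps):
        noise = rng.normal(size=n_samples)
        x = x + 0.5 * step_size * score(x) + np.sqrt(step_size) * noise
    return x

samples = langevin_sample()
print(samples.mean(), samples.std())  # both should be close to 0 and 1
```

The step size controls a bias–mixing trade-off: smaller steps reduce discretization bias of the stationary distribution but require more iterations to converge, which motivates the annealed and ODE-based samplers studied in the papers below.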
Papers
Bellman Diffusion: Generative Modeling as Learning a Linear Operator in the Distribution Space
Yangming Li, Chieh-Hsin Lai, Carola-Bibiane Schönlieb, Yuki Mitsufuji, Stefano Ermon
Equivariant score-based generative models provably learn distributions with symmetries efficiently
Ziyu Chen, Markos A. Katsoulakis, Benjamin J. Zhang
Score-based generative models are provably robust: an uncertainty quantification perspective
Nikiforos Mimikos-Stamatopoulos, Benjamin J. Zhang, Markos A. Katsoulakis
Nonlinear denoising score matching for enhanced learning of structured distributions
Jeremiah Birrell, Markos A. Katsoulakis, Luc Rey-Bellet, Benjamin Zhang, Wei Zhu