Generative Modeling
Generative modeling aims to create new data instances that resemble a given dataset by learning its underlying probability distribution. Current research emphasizes hybrid approaches that combine the strengths of autoregressive models (global context) and diffusion models (high-quality local detail), alongside advances in flow-based and score-based methods. By enabling the creation of realistic and diverse synthetic data, these techniques are influencing fields such as image generation, 3D modeling, time-series forecasting, and scientific applications including molecular dynamics simulation and medical image synthesis.
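As a concrete illustration of the score-based methods mentioned above, the sketch below trains a small network with denoising score matching on toy 2-D data. It is a minimal, hedged example: the names (ScoreNet, dsm_loss, sigma) and the toy dataset are illustrative assumptions and are not drawn from any of the papers listed below.

```python
# Minimal sketch of denoising score matching, the core objective behind
# score-based generative models. All names here are illustrative, not
# taken from any paper in this digest.
import torch
import torch.nn as nn

class ScoreNet(nn.Module):
    """Tiny MLP that predicts the score (gradient of the log-density)
    of noise-perturbed 2-D data."""
    def __init__(self, dim: int = 2, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def dsm_loss(model: nn.Module, x: torch.Tensor, sigma: float = 0.1) -> torch.Tensor:
    """Perturb the data with Gaussian noise and regress the model output
    onto the score of the perturbation kernel, which for Gaussian noise
    is -(noise) / sigma**2."""
    noise = torch.randn_like(x) * sigma
    x_noisy = x + noise
    target = -noise / sigma**2  # score of N(x_noisy | x, sigma^2 I)
    pred = model(x_noisy)
    return ((pred - target) ** 2).sum(dim=1).mean()

if __name__ == "__main__":
    # Toy usage: fit the score of a two-mode 2-D Gaussian mixture sample.
    data = torch.cat([torch.randn(512, 2) + 2, torch.randn(512, 2) - 2])
    model = ScoreNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(200):
        opt.zero_grad()
        loss = dsm_loss(model, data)
        loss.backward()
        opt.step()
```

Once such a score network is trained across a range of noise levels, samples can be drawn with annealed Langevin dynamics or by solving the reverse-time diffusion, which is the setting several of the papers below build on.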
Papers
Don't be so negative! Score-based Generative Modeling with Oracle-assisted Guidance
Saeid Naderiparizi, Xiaoxuan Liang, Berend Zwartsenberg, Frank Wood
Statistically Optimal Generative Modeling with Maximum Deviation from the Empirical Distribution
Elen Vardanyan, Sona Hunanyan, Tigran Galstyan, Arshak Minasyan, Arnak Dalalyan
A Survey on Generative Modeling with Limited Data, Few Shots, and Zero Shot
Milad Abdollahzadeh, Touba Malekzadeh, Christopher T. H. Teo, Keshigeyan Chandrasegaran, Guimeng Liu, Ngai-Man Cheung
Pre-Training with Diffusion models for Dental Radiography segmentation
Jérémy Rousseau, Christian Alaka, Emma Covili, Hippolyte Mayard, Laura Misrachi, Willy Au
QIGen: Generating Efficient Kernels for Quantized Inference on Large Language Models
Tommaso Pegolotti, Elias Frantar, Dan Alistarh, Markus Püschel
Simulation-free Schrödinger bridges via score and flow matching
Alexander Tong, Nikolay Malkin, Kilian Fatras, Lazar Atanackovic, Yanlei Zhang, Guillaume Huguet, Guy Wolf, Yoshua Bengio
Training Energy-Based Models with Diffusion Contrastive Divergences
Weijian Luo, Hao Jiang, Tianyang Hu, Jiacheng Sun, Zhenguo Li, Zhihua Zhang
Learning Lie Group Symmetry Transformations with Neural Networks
Alex Gabel, Victoria Klein, Riccardo Valperga, Jeroen S. W. Lamb, Kevin Webster, Rick Quax, Efstratios Gavves