Generative Modeling
Generative modeling aims to create new data instances that resemble a given dataset by learning the underlying probability distribution of the data. Current research emphasizes hybrid approaches that combine the strengths of autoregressive models (global context) and diffusion models (high-quality local detail), alongside advances in flow-based models and score-based methods. These techniques are having a significant impact across diverse fields, including image generation, 3D modeling, time series forecasting, and scientific applications such as molecular dynamics simulation and medical image synthesis, by enabling the creation of realistic and diverse synthetic data.
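Several of the papers below build on flow matching, a flow-based training objective: sample a point on a path between noise and data, and regress a model's predicted velocity onto the path's true velocity. The following is a minimal sketch of the conditional flow matching loss with a linear (straight-line) path; the function name `cfm_loss` and the placeholder `predict_velocity` model are illustrative assumptions, not any specific paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_loss(predict_velocity, x1_batch):
    """One evaluation of a conditional flow matching objective (linear path).

    predict_velocity: hypothetical model mapping (x_t, t) -> velocity estimate.
    x1_batch: (n, d) array of samples from the data distribution.
    """
    n, d = x1_batch.shape
    x0 = rng.standard_normal((n, d))      # noise samples x0 ~ N(0, I)
    t = rng.uniform(size=(n, 1))          # time t ~ U[0, 1], one per sample
    xt = (1.0 - t) * x0 + t * x1_batch    # point on the straight path x0 -> x1
    target = x1_batch - x0                # true velocity of the linear path
    pred = predict_velocity(xt, t)
    # Mean squared error between predicted and true path velocities
    return np.mean(np.sum((pred - target) ** 2, axis=1))

# Toy usage: a trivial "model" that always predicts zero velocity.
data = rng.standard_normal((64, 2)) + 3.0
loss = cfm_loss(lambda xt, t: np.zeros_like(xt), data)
```

In practice `predict_velocity` would be a neural network trained by minimizing this loss; sampling then integrates the learned velocity field from noise to data.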
Papers
PedSleepMAE: Generative Model for Multimodal Pediatric Sleep Signals
Saurav R. Pandey, Aaqib Saeed, Harlin Lee
Wasserstein Flow Matching: Generative modeling over families of distributions
Doron Haviv, Aram-Alexandre Pooladian, Dana Pe'er, Brandon Amos
α-TCVAE: On the relationship between Disentanglement and Diversity
Cristian Meo, Louis Mahon, Anirudh Goyal, Justin Dauwels
Generator Matching: Generative modeling with arbitrary Markov processes
Peter Holderrieth, Marton Havasi, Jason Yim, Neta Shaul, Itai Gat, Tommi Jaakkola, Brian Karrer, Ricky T. Q. Chen, Yaron Lipman
Idempotent Unsupervised Representation Learning for Skeleton-Based Action Recognition
Lilang Lin, Lehong Wu, Jiahang Zhang, Jiaying Liu
Privacy without Noisy Gradients: Slicing Mechanism for Generative Model Training
Kristjan Greenewald, Yuancheng Yu, Hao Wang, Kai Xu
Utilizing Image Transforms and Diffusion Models for Generative Modeling of Short and Long Time Series
Ilan Naiman, Nimrod Berman, Itai Pemper, Idan Arbiv, Gal Fadlon, Omri Azencot
Flow Generator Matching
Zemin Huang, Zhengyang Geng, Weijian Luo, Guo-jun Qi