Normalizing Flow
Normalizing flows are a class of generative models that learn complex probability distributions by transforming a simple base distribution through a series of invertible transformations. Current research focuses on improving the efficiency and scalability of these flows, particularly for high-dimensional and multi-modal data, with advancements in architectures like continuous normalizing flows and the development of novel training algorithms such as flow matching. These models find applications across diverse fields, including image generation, Bayesian inference, and scientific modeling, offering advantages in density estimation, sampling, and uncertainty quantification.
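The core idea above — pushing a simple base distribution through an invertible map and recovering exact densities via the change-of-variables formula — can be sketched in a few lines. This is a minimal illustrative example, not code from any of the papers below; the class and parameter names are hypothetical. It implements a one-dimensional affine flow y = exp(s)·x + t over a standard-normal base, where the log-density of a data point is the base log-density of its preimage minus the log-determinant of the forward Jacobian (here just s).

```python
import math

def base_log_prob(x):
    # Log density of the standard normal base distribution.
    return -0.5 * (x * x + math.log(2.0 * math.pi))

class AffineFlow:
    """Toy 1-D normalizing flow: y = exp(s) * x + t (always invertible)."""

    def __init__(self, s, t):
        self.s = s  # log-scale parameter (exp keeps the scale positive)
        self.t = t  # shift parameter

    def forward(self, x):
        # Map a base sample into data space.
        return math.exp(self.s) * x + self.t

    def inverse(self, y):
        # Exact inverse of forward.
        return (y - self.t) * math.exp(-self.s)

    def log_prob(self, y):
        # Change of variables: base log-density of the preimage,
        # minus log|det Jacobian| of forward, which is s for this map.
        x = self.inverse(y)
        return base_log_prob(x) - self.s

flow = AffineFlow(s=math.log(2.0), t=1.0)
# This flow turns N(0, 1) into N(mean=1, std=2); evaluate one point:
print(round(flow.log_prob(1.0), 4))  # → -1.6121
```

Real flows stack many such invertible layers (with learned, input-dependent parameters) so the composed map can represent complex, multi-modal densities while the log-determinant terms simply add.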
Papers
$\mathtt{emuflow}$: Normalising Flows for Joint Cosmological Analysis
Arrykrishna Mootoovaloo, Carlos García-García, David Alonso, Jaime Ruiz-Zapatero
VQ-Flow: Taming Normalizing Flows for Multi-Class Anomaly Detection via Hierarchical Vector Quantization
Yixuan Zhou, Xing Xu, Zhe Sun, Jingkuan Song, Andrzej Cichocki, Heng Tao Shen