Normalizing Flows
Normalizing flows are a class of generative models that learn complex probability distributions by transforming a simple base distribution through a sequence of invertible transformations. Current research focuses on improving the efficiency and scalability of these flows, particularly for high-dimensional and multi-modal data, with advances in architectures such as continuous normalizing flows and in training algorithms such as flow matching. These models are applied across diverse fields, including image generation, Bayesian inference, and scientific modeling, and offer advantages in density estimation, sampling, and uncertainty quantification.
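The core mechanism is the change-of-variables formula: if x = f(z) with f invertible and z drawn from a simple base density p_Z, then log p_X(x) = log p_Z(f^{-1}(x)) + log |det J_{f^{-1}}(x)|, so exact likelihoods come from tracking the Jacobian determinant of each transform. The sketch below illustrates this with element-wise affine transforms over a standard-normal base, trained by maximum likelihood; the class and function names (AffineFlow, log_prob) and the toy data are illustrative assumptions, not from any of the papers listed here.

```python
# Minimal normalizing-flow sketch (illustrative, not a specific paper's method):
# a stack of element-wise affine transforms over a standard-normal base,
# fit by maximizing the exact log-likelihood via the change of variables.
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    """Invertible element-wise map x = z * exp(s) + t with analytic Jacobian."""
    def __init__(self, dim):
        super().__init__()
        self.s = nn.Parameter(torch.zeros(dim))  # log-scale
        self.t = nn.Parameter(torch.zeros(dim))  # shift

    def forward(self, z):
        # Sampling direction z -> x; log|det J| = sum(s) for this map.
        x = z * torch.exp(self.s) + self.t
        return x, self.s.sum()

    def inverse(self, x):
        # Density direction x -> z; log|det J^{-1}| = -sum(s).
        z = (x - self.t) * torch.exp(-self.s)
        return z, -self.s.sum()

def log_prob(flows, x):
    # Change of variables through the whole stack:
    # log p_X(x) = log p_Z(z) + sum over layers of log|det J_k^{-1}|.
    log_det_total = torch.zeros(x.shape[0])
    z = x
    for flow in reversed(flows):
        z, log_det = flow.inverse(z)
        log_det_total = log_det_total + log_det
    base = torch.distributions.Normal(0.0, 1.0)
    return base.log_prob(z).sum(dim=1) + log_det_total

# Fit two stacked flows to shifted/scaled Gaussian toy data by
# minimizing the negative log-likelihood.
flows = nn.ModuleList([AffineFlow(2), AffineFlow(2)])
data = torch.randn(512, 2) * 3.0 + 1.0
opt = torch.optim.Adam(flows.parameters(), lr=1e-2)
for step in range(200):
    loss = -log_prob(flows, data).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Richer architectures (coupling layers, continuous normalizing flows) replace the affine map with more expressive invertible transforms while keeping the Jacobian determinant tractable, which is precisely the design tension the research above targets.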
Papers
Density-based Feasibility Learning with Normalizing Flows for Introspective Robotic Assembly
Jianxiang Feng, Matan Atad, Ismael Rodríguez, Maximilian Durner, Stephan Günnemann, Rudolph Triebel
Sampling the lattice Nambu-Goto string using Continuous Normalizing Flows
Michele Caselle, Elia Cellini, Alessandro Nada