Wasserstein Gradient Flow

Wasserstein gradient flow (WGF) describes the evolution of a probability distribution along the path of steepest descent of a chosen objective functional, where steepness is measured with respect to the Wasserstein (optimal-transport) metric. Current research focuses on efficient approximation schemes, such as the JKO (Jordan-Kinderlehrer-Otto) scheme and its variants, as well as neural network-based approaches such as Sinkhorn flows. These advances are impacting diverse fields, including generative modeling, inverse problems, and causal inference, by providing principled and scalable methods for optimizing over probability distributions.
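As a concrete illustration (not taken from any particular paper below), the classical JKO result identifies the Fokker-Planck equation as the Wasserstein gradient flow of the free energy F(rho) = integral of V d(rho) plus the entropy term; at the particle level this flow can be simulated with overdamped Langevin dynamics. The sketch below is a minimal toy example under that assumption, using a quadratic potential V(x) = ||x - mu||^2 / 2 (so the target distribution is a unit-variance Gaussian centered at mu); all names and parameters are illustrative.

```python
import numpy as np

# Minimal particle-based sketch of a Wasserstein gradient flow (toy setup).
# Free energy F(rho) = \int V d(rho) + \int rho log(rho) d(rho); its WGF is
# the Fokker-Planck equation, which at the particle level corresponds to
# overdamped Langevin dynamics: dX = -grad V(X) dt + sqrt(2) dW.

rng = np.random.default_rng(0)

mu = np.array([2.0, -1.0])  # target mean (illustrative choice)

def grad_V(x):
    # V(x) = ||x - mu||^2 / 2  ->  grad V(x) = x - mu
    return x - mu

n_particles, n_steps, dt = 2000, 500, 1e-2
X = rng.normal(size=(n_particles, 2)) * 3.0  # initial particle cloud

for _ in range(n_steps):
    noise = rng.normal(size=X.shape)
    # Euler-Maruyama step: drift along -grad V, diffuse with sqrt(2*dt) noise
    X = X - dt * grad_V(X) + np.sqrt(2.0 * dt) * noise

# The empirical distribution of the particles should now be close to N(mu, I).
print("sample mean:", X.mean(axis=0))
print("sample variance (diag):", X.var(axis=0))
```

Practical JKO-type solvers replace this continuous-time simulation with discrete minimization steps, rho_{k+1} = argmin_rho { F(rho) + W_2^2(rho, rho_k) / (2*tau) }, typically approximated with entropic optimal transport or neural parameterizations as discussed in the papers below.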

Papers