Wasserstein Gradient Flow
Wasserstein gradient flow (WGF) describes the evolution of a probability distribution along the path of steepest descent of an objective functional, where "steepest" is measured in the Wasserstein metric on the space of distributions. Current research focuses on efficient algorithms for approximating WGF, such as the JKO (Jordan–Kinderlehrer–Otto) time-discretization scheme and its variants, and neural network-based approaches such as Sinkhorn flows. These advances are impacting diverse fields, including generative modeling, inverse problems, and causal inference, by providing principled methods for optimizing over probability distributions and improving the accuracy and scalability of existing techniques.
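As a concrete illustration (a minimal sketch, not drawn from any specific paper above): when the objective is the KL divergence to a target density, the WGF is the Fokker–Planck equation, which can be simulated with a cloud of Langevin particles. The standard-Gaussian target and all step sizes below are illustrative assumptions.

```python
import numpy as np

# Particle sketch of a Wasserstein gradient flow.
# Objective: KL(rho || pi) with target pi = N(0, 1). The WGF of this
# functional is the Fokker-Planck equation, simulated via Langevin dynamics.
rng = np.random.default_rng(0)
particles = rng.normal(loc=5.0, scale=2.0, size=10_000)  # initial rho
step = 0.01  # illustrative step size

def grad_potential(x):
    # For pi = N(0, 1), the potential is -log pi(x) = x**2 / 2 + const,
    # so its gradient is simply x.
    return x

for _ in range(2_000):
    noise = rng.normal(size=particles.shape)
    # Deterministic drift down the potential plus diffusion: together these
    # realize steepest descent of KL in the Wasserstein geometry.
    particles = particles - step * grad_potential(particles) \
                + np.sqrt(2 * step) * noise

# The empirical distribution should now be close to the N(0, 1) target.
print(particles.mean(), particles.std())
```

The JKO scheme mentioned above discretizes the same flow in time differently, by repeatedly minimizing the objective plus a Wasserstein proximal penalty; the particle simulation here is only the simplest way to see the flow in action.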