Wasserstein Gradient Flow
Wasserstein gradient flow (WGF) describes the optimization of probability distributions by moving them along paths of steepest descent for a chosen objective functional, where "steepness" is measured in the Wasserstein (optimal-transport) metric. Current research focuses on efficient algorithms for approximating WGF, such as the JKO time-discretization scheme and its variants, and on neural-network-based approaches such as Sinkhorn flows. These advances are impacting diverse fields, including generative modeling, inverse problems, and causal inference, by providing principled and scalable methods for optimizing over probability distributions.
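As an illustrative sketch (not taken from any paper listed here), a classical instance of WGF is that the gradient flow of the KL divergence to a target density exp(-V) under the Wasserstein metric is the Fokker-Planck equation, whose particle discretization is Langevin dynamics. The potential V, step size, and particle count below are all assumptions chosen for illustration:

```python
import numpy as np

# Particle sketch of a Wasserstein gradient flow (illustrative assumptions):
# the WGF of F(rho) = KL(rho || exp(-V)) is the Fokker-Planck PDE, and its
# particle discretization is the Langevin update
#   x <- x - h * grad V(x) + sqrt(2h) * noise.
# With V(x) = x^2 / 2 the flow converges to a standard Gaussian.

rng = np.random.default_rng(0)
h, steps = 0.05, 600
x = 3.0 * rng.standard_normal(5000)   # particles start far from the target

for _ in range(steps):
    grad_V = x                        # gradient of V(x) = x^2 / 2
    x = x - h * grad_V + np.sqrt(2 * h) * rng.standard_normal(x.size)

# The empirical mean and variance should approach (0, 1).
print(round(float(x.mean()), 2), round(float(x.var()), 2))
```

The JKO scheme mentioned above can be viewed as an implicit (proximal) version of this explicit step: each iteration minimizes the objective plus a squared Wasserstein distance to the previous iterate.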