Diffusion Distillation
Diffusion distillation aims to reduce the sampling cost of diffusion models, which are powerful but computationally expensive generative models. Current research focuses on distilling these models into faster single-step or few-step generators using techniques such as distribution matching, consistency distillation, and adversarial training, often leveraging architectures like GANs and deep-equilibrium (DEQ) models. This work is significant because it addresses the main computational bottleneck of diffusion models, enabling their wider application in areas such as image-to-image translation, text-to-image generation, and video editing, while also improving the quality and robustness of generated outputs.
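To make one of the named techniques concrete, below is a minimal sketch of consistency distillation in PyTorch: a student network is trained so that, from any point on the teacher's probability-flow ODE trajectory, it maps directly to the trajectory's origin. Everything here is illustrative rather than any particular paper's implementation: the toy Denoiser, the sigma constants, the adjacent-noise-level sampling, and the plain MSE loss are all placeholder assumptions (real setups use a pretrained teacher U-Net, a fixed sigma discretization, and often a perceptual distance).

```python
# Minimal consistency-distillation sketch (illustrative; hyperparameters and
# architectures are placeholders, not a specific paper's recipe).
import copy
import torch
import torch.nn as nn

SIGMA_MIN, SIGMA_MAX, SIGMA_DATA = 0.002, 80.0, 0.5

class Denoiser(nn.Module):
    """Toy denoiser D(x, sigma) on flat vectors; stands in for a diffusion U-Net."""
    def __init__(self, dim=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 256), nn.SiLU(), nn.Linear(256, dim))
    def forward(self, x, sigma):
        return self.net(torch.cat([x, sigma.log().unsqueeze(-1)], dim=-1))

def consistency_fn(model, x, sigma):
    """f(x, sigma) = c_skip * x + c_out * F(x, sigma), so f(x, sigma_min) ~= x."""
    c_skip = SIGMA_DATA**2 / ((sigma - SIGMA_MIN)**2 + SIGMA_DATA**2)
    c_out = SIGMA_DATA * (sigma - SIGMA_MIN) / (sigma**2 + SIGMA_DATA**2).sqrt()
    return c_skip.unsqueeze(-1) * x + c_out.unsqueeze(-1) * model(x, sigma)

@torch.no_grad()
def teacher_ode_step(teacher, x, sigma_from, sigma_to):
    """One Euler step of the probability-flow ODE dx/dsigma = (x - D(x, sigma)) / sigma."""
    d = (x - teacher(x, sigma_from)) / sigma_from.unsqueeze(-1)
    return x + (sigma_to - sigma_from).unsqueeze(-1) * d

def distill_step(student, target, teacher, x0, optimizer, ema=0.999):
    """One consistency-distillation update on a batch of clean samples x0."""
    b = x0.shape[0]
    # Pick a pair of noise levels sigma_lo < sigma_hi (a fixed schedule in practice).
    sigma_hi = SIGMA_MIN + torch.rand(b) * (SIGMA_MAX - SIGMA_MIN)
    sigma_lo = (sigma_hi * 0.9).clamp(min=SIGMA_MIN)
    # Diffuse the data to sigma_hi, then take one teacher ODE step back to sigma_lo.
    x_hi = x0 + sigma_hi.unsqueeze(-1) * torch.randn_like(x0)
    x_lo = teacher_ode_step(teacher, x_hi, sigma_hi, sigma_lo)
    # The student at sigma_hi must agree with the frozen EMA target at sigma_lo.
    pred = consistency_fn(student, x_hi, sigma_hi)
    with torch.no_grad():
        tgt = consistency_fn(target, x_lo, sigma_lo)
    loss = ((pred - tgt) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # EMA update of the target network.
    with torch.no_grad():
        for p_t, p_s in zip(target.parameters(), student.parameters()):
            p_t.mul_(ema).add_(p_s, alpha=1 - ema)
    return loss.item()

if __name__ == "__main__":
    teacher = Denoiser()           # in practice: a pretrained diffusion model
    student = copy.deepcopy(teacher)
    target = copy.deepcopy(student)
    opt = torch.optim.Adam(student.parameters(), lr=1e-4)
    for _ in range(3):
        x0 = torch.randn(16, 32)   # placeholder "data" batch
        print(distill_step(student, target, teacher, x0, opt))
    # After training, a single forward pass generates from pure noise:
    sigma = torch.full((16,), SIGMA_MAX)
    samples = consistency_fn(student, sigma.unsqueeze(-1) * torch.randn(16, 32), sigma)
```

The same training loop structure carries over to the other distillation families mentioned above: distribution matching replaces the MSE-to-target objective with a score-based divergence between student and teacher output distributions, and adversarial variants add a discriminator loss on the student's one-step samples.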