Score Distillation
Score distillation leverages pre-trained diffusion models to efficiently generate new data, primarily images and 3D models, by transferring the learned score function (the gradient of the log probability density) to a smaller, faster student model. Current research focuses on improving the speed and quality of this transfer, addressing issues such as mode collapse, over-smoothing, and view inconsistency, often through novel loss functions and optimization strategies within frameworks like variational score distillation and denoising diffusion implicit models. The technique is significant for accelerating generative AI and enabling applications such as text-to-3D generation, image editing, and solving inverse problems across fields where training data is scarce.
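To make the transfer concrete, below is a minimal sketch of a Score Distillation Sampling (SDS)-style gradient step, the basic mechanism the variants above build on. It is illustrative only: the names teacher_unet (a frozen noise-prediction diffusion model), student_render, text_embedding, and alphas_cumprod are hypothetical placeholders, and the timestep range and weighting w(t) = 1 - alpha_bar are common choices rather than settings from any of the papers listed.

import torch

def sds_loss(student_render, teacher_unet, text_embedding, alphas_cumprod):
    """One score-distillation-style update (hypothetical helper).

    student_render: image produced by the student (requires grad)
    teacher_unet:   frozen pre-trained diffusion model predicting noise
    alphas_cumprod: the teacher's noise schedule, indexed by timestep
    """
    # Sample a diffusion timestep and noise the student's output,
    # exactly as the teacher's forward process would.
    t = torch.randint(20, 980, (1,), device=student_render.device)
    alpha_bar = alphas_cumprod[t].view(-1, 1, 1, 1)
    noise = torch.randn_like(student_render)
    noisy = alpha_bar.sqrt() * student_render + (1 - alpha_bar).sqrt() * noise

    # The frozen teacher predicts the noise; its residual against the true
    # noise approximates the score direction pulling the student's output
    # toward the teacher's learned image distribution.
    with torch.no_grad():
        eps_pred = teacher_unet(noisy, t, text_embedding)
    grad = (1 - alpha_bar) * (eps_pred - noise)  # weighting w(t), one common choice

    # Detach the target so backpropagation flows only into the student;
    # the loss is constructed so d(loss)/d(student_render) equals grad.
    return (grad.detach() * student_render).sum()

Optimizing the student (e.g., a 3D representation rendered into student_render) against this loss repeatedly is what distills the teacher's score into the student; the methods surveyed above differ mainly in how they reweight, debias, or regularize this gradient.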
Papers
Rethinking Score Distillation as a Bridge Between Image Distributions
David McAllister, Songwei Ge, Jia-Bin Huang, David W. Jacobs, Alexei A. Efros, Aleksander Holynski, Angjoo Kanazawa
Preserving Identity with Variational Score for General-purpose 3D Editing
Duong H. Le, Tuan Pham, Aniruddha Kembhavi, Stephan Mandt, Wei-Chiu Ma, Jiasen Lu