Score Distillation
Score distillation leverages pre-trained diffusion models to efficiently generate new data, primarily images and 3D assets, by transferring the teacher's learned score function (the gradient of the log probability density) to a student, typically a faster generator or a differentiable 3D representation. Current research focuses on improving the speed and quality of this transfer, addressing issues such as mode collapse, over-smoothing, high gradient variance, and multi-view inconsistency, often through novel loss functions and optimization strategies within frameworks like variational score distillation and denoising diffusion implicit models (DDIM). The technique is significant for accelerating generative AI and for enabling applications such as text-to-3D generation, image editing, and inverse-problem solving, particularly where training data is scarce.
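For context, the papers below build on score distillation sampling (SDS), introduced in DreamFusion: the parameters of a differentiable renderer are optimized so that its noised outputs look plausible to a frozen pre-trained denoiser, whose noise-prediction residual acts as an approximate score. Below is a minimal PyTorch sketch of one SDS step; the `render_fn` and `denoiser` call signatures are illustrative assumptions rather than any particular library's API.

```python
import torch

def sds_step(theta, render_fn, denoiser, prompt_emb, alphas, sigmas, weights):
    """One score distillation sampling (SDS) step (sketch).

    theta:     leaf tensor of student parameters (e.g. a NeRF), requires_grad=True
    render_fn: differentiable renderer mapping theta -> image tensor (assumed)
    denoiser:  frozen noise predictor eps_phi(x_t, t, y) (assumed signature)
    alphas, sigmas, weights: 1-D tensors holding the noise schedule and w(t)
    """
    x = render_fn(theta)                      # differentiable rendering x = g(theta)
    t = torch.randint(0, len(alphas), ())     # uniformly sampled diffusion timestep
    eps = torch.randn_like(x)                 # injected Gaussian noise
    x_t = alphas[t] * x + sigmas[t] * eps     # forward-diffused rendering
    with torch.no_grad():                     # teacher is frozen: no gradient
        eps_hat = denoiser(x_t, t, prompt_emb)
    # SDS treats the residual as a constant and backpropagates it through
    # the renderer only, i.e. grad = w(t) * (eps_hat - eps) * dx/dtheta.
    x.backward(gradient=weights[t] * (eps_hat - eps))
```

An optimizer step on `theta.grad` then completes the update. The high variance of this residual-based gradient is what SteinDreamer targets with Stein-identity control variates, and its tendency to collapse onto a single mode motivates the second paper below.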
Papers
SteinDreamer: Variance Reduction for Text-to-3D Score Distillation via Stein Identity
Peihao Wang, Zhiwen Fan, Dejia Xu, Dilin Wang, Sreyas Mohan, Forrest Iandola, Rakesh Ranjan, Yilei Li, Qiang Liu, Zhangyang Wang, Vikas Chandra
Taming Mode Collapse in Score Distillation for Text-to-3D Generation
Peihao Wang, Dejia Xu, Zhiwen Fan, Dilin Wang, Sreyas Mohan, Forrest Iandola, Rakesh Ranjan, Yilei Li, Qiang Liu, Zhangyang Wang, Vikas Chandra