Mirror Diffusion Model
Mirror diffusion models (MDMs) address the challenge of generating data that must satisfy hard constraints, a limitation of standard diffusion models. Research focuses on efficient algorithms, such as neural approximate mirror maps, that learn an invertible transformation from the constrained data space to an unconstrained dual space where the diffusion model is trained; generated samples are then mapped back onto the constraint set. This approach improves the accuracy and reliability of generative models across diverse applications, including quantum state generation, image translation, and solving inverse problems. The resulting models offer enhanced control and fidelity in generating synthetic data while retaining the advantages of diffusion-based generative modeling.
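As a concrete illustration of the forward/inverse mapping idea (a minimal sketch, not code from any of the papers listed below), consider data constrained to the open hypercube (0,1)^d. A classic closed-form mirror map here is the gradient of an entropy-style potential, which works out to the elementwise logit; its inverse is the sigmoid. Training happens on the unconstrained logit values, and mapping generated samples back through the sigmoid guarantees they satisfy the constraint. The function names `mirror_map` and `inverse_mirror_map` are illustrative, not from a specific library:

```python
import math

def mirror_map(x):
    """Forward map: (0,1) -> R via the logit (gradient of x*log x + (1-x)*log(1-x))."""
    return math.log(x) - math.log1p(-x)

def inverse_mirror_map(y):
    """Inverse map: R -> (0,1) via the sigmoid, so outputs always satisfy the constraint."""
    return 1.0 / (1.0 + math.exp(-y))

# Round trip: constrained sample -> unconstrained dual space -> back to the constraint set.
xs = [0.1, 0.5, 0.9]                              # points inside the constraint set (0,1)
ys = [mirror_map(x) for x in xs]                  # unconstrained values a diffusion model could be trained on
xs_back = [inverse_mirror_map(y) for y in ys]     # generated values land back inside (0,1)

assert all(abs(a - b) < 1e-9 for a, b in zip(xs, xs_back))
assert all(0.0 < x < 1.0 for x in xs_back)
```

Neural approximate mirror maps generalize this recipe to constraint sets with no convenient closed-form map, learning the forward and inverse transformations instead of deriving them analytically.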
Papers
June 18, 2024
April 9, 2024
January 6, 2024
October 2, 2023
August 11, 2023
June 27, 2023