Generative Foundation Model
Generative foundation models are large-scale, pre-trained models capable of generating diverse data types, including text, images, audio, and tabular data, with the goal of producing realistic, controllable synthetic data across many domains. Current research focuses on adapting these models to specific applications, such as financial market simulation, sleep staging, and 3D mesh generation, often employing diffusion models and transformers. By generating synthetic training data for downstream tasks and enabling efficient exploration of complex systems, these models hold significant potential for accelerating scientific discovery, particularly in data-scarce areas.
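To make the diffusion-model approach mentioned above concrete, the sketch below shows the standard forward (noising) process that diffusion models use to corrupt training data; a learned network would then reverse this process step by step to synthesize new samples. This is a minimal NumPy illustration of the closed-form noising equation, not any specific model from the papers listed here; the schedule length, beta range, and "tabular" data are all illustrative assumptions.

```python
import numpy as np

# Toy forward diffusion: x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps.
# All hyperparameters below are illustrative, not taken from any cited paper.
rng = np.random.default_rng(0)

T = 1000                                  # number of noising steps (assumed)
betas = np.linspace(1e-4, 0.02, T)        # linear noise schedule (assumed)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)           # cumulative signal-retention factor

def q_sample(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) in closed form at timestep t."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return xt, eps

# Pretend "tabular" data: 256 rows, 4 numeric features.
x0 = rng.standard_normal((256, 4))
xT, _ = q_sample(x0, T - 1, rng)

# By the final step, almost all signal has been replaced by Gaussian noise;
# a trained denoiser would invert this chain to generate synthetic rows.
print(f"signal retained at step T: {alpha_bars[-1]:.2e}")
```

In practice the reverse (denoising) network is where the "foundation" part comes in: it is trained once at scale and then adapted to specific domains such as finance or sleep data.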
Papers
October 17, 2024
October 10, 2024
October 8, 2024
September 4, 2024
August 9, 2024
June 25, 2024
June 13, 2024
June 7, 2024
May 31, 2024
May 22, 2024
May 20, 2024
May 16, 2024
May 6, 2024
March 18, 2024
March 5, 2024
December 6, 2023
October 18, 2023
May 18, 2023
April 13, 2023
December 29, 2022