Acceleration Method
Acceleration methods aim to significantly speed up computationally intensive processes across diverse fields, from optimization algorithms and machine learning model training to scientific simulations and image generation. Current research focuses on developing and analyzing acceleration techniques tailored to specific architectures, such as diffusion transformers and large language models, as well as improving existing methods like stochastic gradient descent and Riemannian optimization. These advancements are crucial for enabling the practical application of complex models and simulations in areas like autonomous driving, materials science, and natural language processing, where computational efficiency is paramount.
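As one concrete illustration of what "acceleration" means in the optimization setting mentioned above, the sketch below implements Nesterov's accelerated gradient method, a classic momentum-based acceleration of plain gradient descent, on a toy quadratic problem. It is a minimal sketch only: the function names, the test problem, and the step-size choice are illustrative assumptions and are not taken from any of the listed papers.

```python
import numpy as np

def nesterov_accelerated_gradient(grad, x0, step_size, num_iters):
    """Nesterov's accelerated gradient method (a classic acceleration scheme).

    grad      : callable returning the gradient of the objective
    x0        : starting point (numpy array)
    step_size : 1/L for an L-smooth objective (assumed known here)
    num_iters : number of iterations
    """
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(num_iters):
        x = y - step_size * grad(y)                   # gradient step at the look-ahead point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # momentum extrapolation
        x_prev, t = x, t_next
    return x_prev

if __name__ == "__main__":
    # Illustrative quadratic: f(x) = 0.5 * x^T A x - b^T x with A symmetric positive definite.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 20))
    A = M.T @ M + 0.1 * np.eye(20)
    b = rng.standard_normal(20)
    L = np.linalg.eigvalsh(A).max()                   # smoothness constant of the quadratic

    grad = lambda x: A @ x - b
    x_star = np.linalg.solve(A, b)                    # closed-form minimizer for reference

    x_acc = nesterov_accelerated_gradient(grad, np.zeros(20), 1.0 / L, 200)
    print("distance to optimum:", np.linalg.norm(x_acc - x_star))
```

Compared with plain gradient descent at the same step size, this momentum scheme improves the worst-case convergence rate on smooth convex problems from O(1/k) to O(1/k^2), which is the prototypical speed-up that more recent, architecture-specific acceleration methods generalize.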