Superior Optimizer
Superior optimizer research focuses on developing algorithms that efficiently and effectively solve complex optimization problems, particularly within machine learning and related fields. Current efforts concentrate on improving existing methods like Adam and SGD, exploring novel approaches such as Hessian-informed optimizers and learned optimizers based on transformer architectures, and addressing challenges like the optimizer's curse and efficient distributed training. These advancements have significant implications for training large-scale models, accelerating scientific discovery, and optimizing resource-intensive applications across various domains.
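As a concrete reference point for the methods mentioned above, the Adam update rule can be sketched in a few lines of dependency-free Python. This is a minimal illustration of the standard algorithm (bias-corrected first and second moment estimates), not the implementation from any particular paper; the quadratic objective at the bottom is an assumed toy example.

```python
import math

def adam_step(theta, grad, m, v, t,
              lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter.

    m, v are running first/second moment estimates; t is the step count.
    Returns the updated (theta, m, v, t).
    """
    t += 1
    m = beta1 * m + (1 - beta1) * grad          # first-moment EMA
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment EMA
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v, t

# Toy usage: minimize f(x) = (x - 3)^2 starting from x = 0.
x, m, v, t = 0.0, 0.0, 0.0, 0
for _ in range(1000):
    grad = 2 * (x - 3)                          # f'(x)
    x, m, v, t = adam_step(x, grad, m, v, t)
```

After the loop, `x` sits close to the minimizer at 3. Hessian-informed optimizers would replace the per-coordinate `sqrt(v_hat)` scaling with curvature information, and learned optimizers would replace the hand-designed update rule itself with a trained network.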