Superior Optimizer
Superior-optimizer research focuses on developing algorithms that solve complex optimization problems efficiently, particularly within machine learning and related fields. Current efforts concentrate on improving established methods such as Adam and SGD, exploring novel approaches such as Hessian-informed optimizers and learned optimizers built on transformer architectures, and addressing challenges like the optimizer's curse and efficient distributed training. These advances have significant implications for training large-scale models, accelerating scientific discovery, and optimizing resource-intensive applications across many domains.
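To ground the discussion of methods like Adam, here is a minimal sketch of a single Adam update step, written from the standard published update rule (exponential moving averages of the gradient and its square, with bias correction); the function and parameter names are illustrative, not from any specific paper listed here.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameters theta given gradient grad at step t (1-indexed)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment: running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)              # bias correction for zero initialization
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(theta) = theta^2, whose gradient is 2 * theta.
theta, m, v = np.array(5.0), 0.0, 0.0
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.05)
```

After enough steps the iterate approaches the minimizer at zero; the per-coordinate scaling by the second moment is what distinguishes Adam from plain SGD.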