Second Order
Second-order methods in machine learning exploit curvature information, typically via the Hessian matrix or tractable approximations to it, to accelerate optimization and improve model performance relative to first-order methods. Current research focuses on making second-order algorithms computationally feasible at scale, for example through diagonal Hessian approximations or low-rank matrix factorizations, with applications to training large-scale models such as LLMs and to reinforcement learning. These advances matter because they can deliver faster convergence, better generalization, and greater robustness across applications including image classification, natural language processing, and robotics.
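To make the idea concrete, here is a minimal sketch (not taken from any particular paper) of a diagonal-Hessian update: each gradient coordinate is divided by a per-coordinate curvature estimate, so steep directions take small steps and flat directions take large ones. The quadratic objective and all function names are illustrative assumptions; practical methods estimate the Hessian diagonal stochastically rather than computing it exactly.

```python
import numpy as np

# Illustrative ill-conditioned quadratic: f(w) = 0.5 * w^T A w
A = np.diag([100.0, 1.0])

def loss(w):
    return 0.5 * w @ A @ w

def grad(w):
    return A @ w

def hess_diag(w):
    # Exact Hessian diagonal for this toy problem; in practice it is
    # estimated, e.g. with Hutchinson-style stochastic probes.
    return np.diag(A)

def diag_newton(w0, steps=10, eps=1e-8):
    """Precondition the gradient with a diagonal curvature estimate."""
    w = w0.copy()
    for _ in range(steps):
        # Per-coordinate step: divide gradient by curvature (+eps for safety)
        w -= grad(w) / (hess_diag(w) + eps)
    return w

w = diag_newton(np.array([1.0, 1.0]))
print(loss(w))  # near zero: curvature scaling handles the 100:1 conditioning
```

Plain gradient descent on the same problem must use a learning rate small enough for the stiff direction (curvature 100), so it crawls along the flat direction; the diagonal preconditioner removes that conditioning penalty, which is the core appeal of second-order methods.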