Second-Order Algorithms
Second-order algorithms leverage curvature information (the Hessian matrix) to accelerate optimization, aiming for faster convergence than first-order methods, which use only gradients. Current research focuses on developing efficient and robust second-order methods for a range of challenging problems, including minimax optimization, zero-sum games, and large-scale machine learning tasks, often employing techniques such as cubic regularization, Hessian sketching, and adaptive step sizes. These advances matter because they enable faster training of complex models and improved solution quality in diverse applications, from deep learning to scientific machine learning and differentially private optimization.
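As a concrete illustration of the curvature-based update described above, here is a minimal sketch of a damped Newton iteration: each step solves a linear system built from the Hessian rather than following the raw gradient. This is a generic textbook scheme, not the method of any particular paper; the function names and the `damping` parameter are illustrative choices.

```python
import numpy as np

def damped_newton(grad, hess, x0, steps=20, damping=1e-6):
    """Sketch of a second-order update: x <- x - (H + damping*I)^{-1} g.

    `grad(x)` returns the gradient vector, `hess(x)` the Hessian matrix.
    The small damping term regularizes the Hessian so the Newton system
    stays solvable even when H is singular or ill-conditioned.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        H = hess(x)
        # Newton step: solve (H + damping*I) step = g, then move against it.
        step = np.linalg.solve(H + damping * np.eye(len(x)), g)
        x = x - step
    return x

# Toy quadratic f(x) = 0.5 x^T A x - b^T x, whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = damped_newton(lambda x: A @ x - b, lambda x: A, np.zeros(2))
```

On a quadratic the Hessian is constant and a single (undamped) Newton step lands on the exact minimizer, which is why second-order methods can converge dramatically faster than gradient descent near a well-conditioned optimum.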