Second-Order Algorithms

Second-order algorithms leverage curvature information (the Hessian matrix) to accelerate optimization, aiming for faster convergence than first-order methods that use only gradients. Current research focuses on developing efficient and robust second-order methods for challenging problems, including minimax optimization, zero-sum games, and large-scale machine learning tasks, often employing techniques such as cubic regularization, Hessian sketching, and adaptive step sizes. These advances matter because they enable faster training of complex models and improved solution quality across diverse applications, from deep learning to scientific machine learning and differentially private optimization.
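
To make the idea of using curvature concrete, here is a minimal sketch of a damped Newton step in NumPy; the function names (`newton_step`, `grad_fn`, `hess_fn`), the damping parameter, and the quadratic test problem are illustrative assumptions and are not drawn from any specific paper listed below.

```python
import numpy as np

def newton_step(grad_fn, hess_fn, x, damping=1e-4):
    """One damped Newton step: x_new = x - (H + damping*I)^{-1} g.

    The damping term (Levenberg-style) is an assumed safeguard that keeps
    the linear system well-conditioned when the Hessian is near-singular
    or indefinite.
    """
    g = grad_fn(x)
    H = hess_fn(x)
    step = np.linalg.solve(H + damping * np.eye(len(x)), g)
    return x - step

# Toy example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b and whose Hessian is the constant matrix A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b
hess = lambda x: A

x = np.zeros(2)
for _ in range(5):
    x = newton_step(grad, hess, x)
print(x, np.linalg.solve(A, b))  # converges to the exact minimizer A^{-1} b
```

On a quadratic, the Newton step recovers the minimizer essentially in one iteration, which is the payoff of curvature information; the methods surveyed here (cubic regularization, Hessian sketching, adaptive step sizes) can be viewed as ways to retain this behavior while controlling the cost and instability of forming or inverting the full Hessian.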

Papers