Newton's Method
Newton's method, an iterative technique for finding the roots of equations or the minimizers of smooth functions, remains a central focus of optimization research, particularly in machine learning and scientific computing. Current work aims to improve its efficiency and robustness on high-dimensional problems through techniques such as cubic regularization, variance reduction, and subspace methods (e.g., Krylov subspace methods), and to adapt it to federated and privacy-constrained settings. These advances broaden the applicability of Newton-type methods to large-scale problems, with impact ranging from neural network training and PDE solving to multi-objective optimization and robust statistics.
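At its core, a Newton-type optimizer repeatedly solves a linear system with the Hessian to form the update x_{k+1} = x_k - H(x_k)^{-1} ∇f(x_k). The sketch below is a minimal, generic NumPy implementation of this plain damped update on an illustrative Rosenbrock test problem; it is not taken from any of the papers referenced above, where variants such as cubic regularization, variance reduction, or Krylov subspace solves replace the exact dense solve with cheaper or better-safeguarded steps.

import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=50, damping=1e-8):
    """Plain Newton's method for unconstrained minimization.

    grad, hess : callables returning the gradient vector and Hessian matrix.
    damping    : small multiple of the identity added to the Hessian so the
                 linear solve stays well-posed (a simple safeguard; modern
                 variants use stronger fixes such as cubic regularization).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:          # first-order stationarity reached
            break
        H = hess(x) + damping * np.eye(x.size)
        x = x - np.linalg.solve(H, g)        # Newton step: x <- x - H^{-1} g
    return x

# Illustrative test problem: the 2-D Rosenbrock function, minimized at (1, 1).
def rosen_grad(x):
    return np.array([-2.0 * (1 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
                     200.0 * (x[1] - x[0] ** 2)])

def rosen_hess(x):
    return np.array([[2.0 + 1200.0 * x[0] ** 2 - 400.0 * x[1], -400.0 * x[0]],
                     [-400.0 * x[0], 200.0]])

print(newton_minimize(rosen_grad, rosen_hess, x0=[-1.2, 1.0]))  # approx. [1. 1.]

The dense solve here costs O(n^3) per iteration, which is exactly the bottleneck that the subspace and Krylov-based approaches mentioned above are designed to avoid at scale.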