Gradient Method
Gradient methods are iterative optimization algorithms that seek the minimum (or maximum) of a function by repeatedly stepping in the direction of the negative (or positive) gradient. Current research focuses on improving the efficiency and robustness of these methods, particularly for the non-convex problems that arise in deep learning, by exploring variants such as stochastic gradient descent, adaptive methods (e.g., Adam), and schemes that incorporate second-order information or preconditioning. These advances are crucial for training complex models, enabling progress in fields such as scientific machine learning and improving performance across a wide range of machine learning tasks.
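As a minimal illustration of the basic update rule, the sketch below performs plain gradient descent; the objective, its gradient, the step size, and the iteration budget are illustrative assumptions rather than settings from any particular paper.

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function by repeatedly stepping against its gradient.

    grad  -- callable returning the gradient at a point (assumed signature)
    x0    -- starting point
    lr    -- step size (learning rate); illustrative default
    steps -- fixed iteration budget
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)  # step in the negative-gradient direction
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
x_min = gradient_descent(lambda x: 2 * (x - 3.0), x0=[0.0])
print(x_min)  # converges toward [3.0]
```

Stochastic and adaptive variants mentioned above replace the exact gradient with a noisy mini-batch estimate or rescale the step per coordinate, but follow the same iterative template.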