Step Size
Step size, a crucial parameter in iterative optimization algorithms, determines how far each update moves the current iterate toward a solution. Current research focuses on adaptive step size strategies that replace manually tuned or fixed values, across algorithms including stochastic gradient descent (SGD), Adam, and evolution strategies. These adaptive methods aim to improve convergence speed, robustness, and generalization in settings such as deep learning model training and federated learning. Efficient and theoretically grounded adaptive step size techniques are therefore vital for advancing optimization methods and improving the performance of the many machine learning models that depend on them.
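As a rough illustration of the contrast between a fixed step size and an adaptive one, the sketch below compares plain fixed-step gradient descent with an AdaGrad-style per-coordinate step size on a simple quadratic objective. The quadratic, the function names, and all parameter values are illustrative assumptions, not taken from any particular paper listed here.

```python
import numpy as np

# Hypothetical ill-conditioned quadratic: f(x) = 0.5 * x^T A x (illustrative only).
A = np.diag([10.0, 1.0])
grad = lambda x: A @ x  # gradient of the quadratic

def adagrad(x0, base_lr=1.0, steps=100, eps=1e-8):
    """AdaGrad-style adaptive step size: shrinks per coordinate as gradients accumulate."""
    x = x0.copy()
    accum = np.zeros_like(x)          # running sum of squared gradients
    for _ in range(steps):
        g = grad(x)
        accum += g ** 2
        x -= base_lr / (np.sqrt(accum) + eps) * g  # per-coordinate effective step size
    return x

def fixed_sgd(x0, lr=0.05, steps=100):
    """Gradient descent with a single, manually chosen step size."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * grad(x)             # same step size for every coordinate
    return x

x0 = np.array([1.0, 1.0])
print("AdaGrad-style:", adagrad(x0))  # adapts to each coordinate's gradient history
print("Fixed step:   ", fixed_sgd(x0))
```

The adaptive variant assigns a smaller effective step to the steep coordinate and a larger one to the flat coordinate, which is the basic effect the adaptive methods surveyed above generalize to stochastic and large-scale settings.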