Step Size
Step size, a crucial parameter in iterative optimization algorithms, determines the magnitude of each update toward a solution. Current research focuses on adaptive step size strategies that replace manually tuned or fixed values, spanning algorithms such as stochastic gradient descent (SGD), Adam, and evolution strategies. These adaptive methods aim to improve convergence speed, robustness, and generalization in applications ranging from federated learning to deep learning model training. Developing efficient and theoretically sound adaptive step size techniques is therefore vital for advancing optimization methods and improving the performance of a wide range of machine learning models.
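To make the idea concrete, the sketch below contrasts a fixed step size with an Adam-style update, in which the effective per-parameter step size adapts to running estimates of the gradient's first and second moments. This is a minimal illustration in Python; the toy objective f(x) = (x - 3)^2, the learning rate of 0.1, and the iteration count are assumptions chosen for demonstration, not taken from any of the papers listed below.

import math

def adam_step(param, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam update on a scalar parameter: the effective step size adapts
    # to running estimates of the gradient's first and second moments.
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias corrections
    v_hat = v / (1 - beta2 ** t)
    param -= lr * m_hat / (math.sqrt(v_hat) + eps)  # adaptive step
    return param, (m, v, t)

# Toy comparison on f(x) = (x - 3)^2: fixed-step SGD vs. the Adam-style rule.
x_sgd, x_adam, state = 0.0, 0.0, (0.0, 0.0, 0)
for _ in range(2000):
    x_sgd -= 0.1 * 2 * (x_sgd - 3)                          # fixed step size
    x_adam, state = adam_step(x_adam, 2 * (x_adam - 3), state, lr=0.1)
print(x_sgd, x_adam)  # both approach the minimizer x = 3.0

The design point adaptive rules share is visible in the denominator sqrt(v_hat) + eps: the step shrinks along directions with large or noisy gradients and grows along flat ones, reducing the sensitivity to a manually chosen learning rate.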
Papers
Nineteen papers, dated from January 27, 2023 to November 28, 2023.