Step Size
Step size, a crucial parameter in iterative optimization algorithms, determines the magnitude of each update toward a solution. Current research focuses on adaptive step size strategies that move away from manually tuned or fixed values, spanning algorithms such as stochastic gradient descent (SGD), Adam, and evolution strategies. These adaptive methods aim to improve convergence speed, robustness, and generalization in settings ranging from standard deep learning model training to federated learning. Developing efficient and theoretically sound adaptive step size techniques is therefore central to advancing optimization methods and improving the performance of machine learning models.
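To make the idea of an adaptive step size concrete, the sketch below implements the standard Adam update rule in NumPy: the effective per-parameter step size shrinks or grows with running estimates of the gradient's first and second moments, rather than staying fixed. The function name, toy objective, and hyperparameter values are illustrative assumptions, not taken from any particular paper listed here.

```python
import numpy as np

def adam_step(params, grads, state, base_lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: the effective per-parameter step size adapts to
    running first- and second-moment estimates of the gradient."""
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * grads        # exponential moving average of gradients
    v = beta2 * v + (1 - beta2) * grads ** 2   # exponential moving average of squared gradients
    m_hat = m / (1 - beta1 ** t)               # bias correction for the first moment
    v_hat = v / (1 - beta2 ** t)               # bias correction for the second moment
    params = params - base_lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, (m, v, t)

# Toy usage: minimize f(x) = ||x||^2, whose gradient is 2x.
x = np.array([3.0, -2.0])
state = (np.zeros_like(x), np.zeros_like(x), 0)
for _ in range(500):
    grad = 2 * x
    x, state = adam_step(x, grad, state, base_lr=0.05)
print(x)  # approaches the minimizer at the origin
```

Note that even though `base_lr` is fixed, the quantity actually applied to each coordinate, `base_lr * m_hat / (sqrt(v_hat) + eps)`, varies over time and across parameters; this is the sense in which such methods adapt the step size automatically.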