Step Size

Step size, a crucial parameter in iterative optimization algorithms, determines the magnitude of each update toward a solution. Current research focuses on adaptive step size strategies that replace manually tuned or fixed values across a range of algorithms, including stochastic gradient descent (SGD), Adam, and evolution strategies. These adaptive methods aim to improve convergence speed, robustness, and generalization performance in diverse applications such as machine learning, federated learning, and deep learning model training. Developing efficient and theoretically sound adaptive step size techniques is therefore vital for advancing optimization methods and improving the performance of many machine learning models.
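
As one concrete illustration of an adaptive step size, the minimal NumPy sketch below implements the standard Adam update rule mentioned above; the function name `adam_step` and the toy quadratic objective are illustrative choices, not taken from any particular paper. The key point is that the nominal learning rate is rescaled per parameter by a running estimate of gradient magnitude, so the effective step size adapts during training rather than staying fixed.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: the effective per-parameter step size adapts
    to the history of gradient magnitudes (illustrative sketch)."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction for the warm-up phase
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)  # adaptive, per-parameter step
    return param, m, v

# Toy usage: minimize f(x) = ||x||^2, whose gradient is 2x.
x = np.array([3.0, -2.0])
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 501):
    grad = 2 * x
    x, m, v = adam_step(x, grad, m, v, t, lr=0.1)
print(x)  # converges toward [0, 0]
```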

Papers