Paper ID: 2410.19706
Super Gradient Descent: Global Optimization requires Global Gradient
Seifeddine Achour
Global minimization is a fundamental challenge in optimization, especially in machine learning, where finding the global minimum of a function directly impacts model performance and convergence. This report introduces a novel optimization method, which we call Super Gradient Descent, designed specifically for one-dimensional functions and guaranteed to converge to the global minimum of any k-Lipschitz function defined on a closed interval [a, b]. Our approach addresses a key limitation of traditional optimization algorithms, which often get trapped in local minima. In particular, we introduce the concept of the global gradient, which offers a robust solution for precise and well-guided global optimization. By focusing on the global minimization problem, this work bridges a critical gap in optimization theory, offering new insights and practical advances for a range of optimization problems, in particular machine learning tasks such as line search.
Submitted: Oct 25, 2024
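To make the problem setting concrete: the abstract does not define the paper's global gradient, so the sketch below is not Super Gradient Descent itself. It is the classical Piyavskii-Shubert algorithm, which addresses the same problem class (certified global minimization of a k-Lipschitz function on a closed interval [a, b]) and illustrates why a Lipschitz bound makes global convergence provable. The function names and tolerances are illustrative choices, not the paper's.

import math

def piyavskii_shubert(f, a, b, k, tol=1e-6, max_iter=1000):
    """Globally minimize a k-Lipschitz function f on [a, b].

    The Lipschitz bound gives, for any sampled point x_i, the lower
    envelope f(x_i) - k*|x - x_i|; the true global minimum can never
    lie below the pointwise maximum of these envelopes, which yields
    a certified optimality gap.
    """
    # Sampled points and their values, kept sorted by x.
    xs = [a, b]
    ys = [f(a), f(b)]

    for _ in range(max_iter):
        y_best = min(ys)
        best_gap, best_i, best_x = 0.0, None, None
        # On each cell [x_i, x_j], the Lipschitz lower bound attains
        # its minimum where the two cones intersect:
        #   x* = (x_i + x_j)/2 + (y_i - y_j)/(2k)
        #   lb = (y_i + y_j)/2 - k*(x_j - x_i)/2
        for i in range(len(xs) - 1):
            xi, xj, yi, yj = xs[i], xs[i + 1], ys[i], ys[i + 1]
            x_star = 0.5 * (xi + xj) + (yi - yj) / (2.0 * k)
            lb = 0.5 * (yi + yj) - 0.5 * k * (xj - xi)
            gap = y_best - lb  # certified gap on this cell
            if gap > best_gap:
                best_gap, best_i, best_x = gap, i, x_star
        if best_gap <= tol:
            break  # global minimum certified to within tol
        # Refine the cell with the largest remaining gap.
        xs.insert(best_i + 1, best_x)
        ys.insert(best_i + 1, f(best_x))

    j = min(range(len(xs)), key=lambda i: ys[i])
    return xs[j], ys[j]

# Example: a multimodal function on [0, 10]; |d/dx (sin x + 0.1x)| <= 1.1,
# so k = 1.1 is a valid Lipschitz constant.
x_min, f_min = piyavskii_shubert(lambda x: math.sin(x) + 0.1 * x, 0.0, 10.0, k=1.1)
print(x_min, f_min)

The key design point, shared with any method in this class, is that the Lipschitz constant k converts local samples into global information: every evaluation tightens a lower bound valid over the whole interval, which is what rules out being trapped in a local minimum.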