Optimization Algorithms
Optimization algorithms aim to find the best solution within a given search space, a fundamental problem across many scientific and engineering disciplines. Current research emphasizes improving convergence rates and efficiency, particularly in distributed settings and for high-dimensional problems. Active directions include gradient descent and its variants (second-order methods and adaptive optimizers), as well as meta-learning approaches and derivative-free methods. These advances are crucial for tackling increasingly complex problems in machine learning, wireless systems, and other fields where efficient and robust optimization is paramount.
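To make the idea concrete, here is a minimal sketch of gradient descent, the baseline that the variants above build on. The objective f(x) = (x - 3)^2, the step size, and the iteration count are illustrative assumptions, not taken from any of the papers below.

```python
# Minimal gradient descent sketch: repeatedly step against the gradient.
# The objective, learning rate, and step count are illustrative choices.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Iterate x <- x - lr * grad(x) for a fixed number of steps."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Example: f(x) = (x - 3)^2, so grad f(x) = 2 * (x - 3); minimum at x = 3.
minimizer = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimizer, 4))  # converges toward 3.0
```

Adaptive optimizers and second-order methods replace the fixed learning rate here with per-parameter or curvature-informed step sizes, which is where much of the efficiency research cited above focuses.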
Papers
GNBG: A Generalized and Configurable Benchmark Generator for Continuous Numerical Optimization
Danial Yazdani, Mohammad Nabi Omidvar, Delaram Yazdani, Kalyanmoy Deb, Amir H. Gandomi
GNBG-Generated Test Suite for Box-Constrained Numerical Global Optimization
Amir H. Gandomi, Danial Yazdani, Mohammad Nabi Omidvar, Kalyanmoy Deb