Derivative-Free Optimization
Derivative-free optimization (DFO) addresses the problem of optimizing functions whose derivatives are unavailable or too expensive to compute. Current research focuses on efficient algorithms, such as Bayesian optimization, ensemble Kalman methods, and novel metaheuristics (e.g., Gaussian Crunching Search), for difficult problem classes, including high-dimensional and non-convex objectives. DFO's significance lies in its broad applicability: it enables optimization in settings ranging from scientific modeling (e.g., fluid dynamics, materials science) to machine learning (e.g., hyperparameter tuning, personalized LLMs) where gradient-based methods are impractical. Developing robust and scalable DFO methods is therefore crucial for advancing both research and practical applications in these areas.
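To make the core idea concrete, here is a minimal sketch of one of the simplest derivative-free strategies: random search with improvement-only acceptance and a decaying step size. This is an illustrative toy, not any of the specific methods named above (Bayesian optimization and ensemble Kalman methods are considerably more sophisticated); the function and parameter names are invented for this example.

```python
import random

def random_search(f, x0, n_iters=2000, step=0.5, decay=0.995, seed=0):
    """Toy derivative-free optimizer: perturb the incumbent point with
    Gaussian noise, keep the candidate only if it improves f, and
    gradually shrink the perturbation scale. Uses no gradient information."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(n_iters):
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = f(cand)
        if fc < fx:          # accept only strict improvements
            x, fx = cand, fc
        step *= decay        # shrink search radius over time
    return x, fx

# Example: minimize a shifted quadratic without ever evaluating a gradient.
f = lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2
x_best, f_best = random_search(f, [0.0, 0.0])
```

Even this naive scheme converges on smooth low-dimensional problems; the research methods mentioned above exist precisely because such brute-force search scales poorly to high-dimensional or expensive objectives.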