Paper ID: 2302.11406
Enhancing Machine Learning Model Performance with Hyper Parameter Optimization: A Comparative Study
Caner Erden, Halil Ibrahim Demir, Abdullah Hulusi Kökçam
One of the most critical issues in machine learning is the selection of appropriate hyperparameters for training models. With hyperparameter optimization (HPO) techniques, machine learning models can reach their best training performance and improve their ability to generalize. HPO has recently become a focal topic in artificial intelligence research and continues to attract increasing interest. Traditional methods developed for HPO include exhaustive search, grid search, random search, and Bayesian optimization, while meta-heuristic algorithms are employed as more advanced methods. Meta-heuristic algorithms explore the solution space, converging toward the best combination for a given problem; they test various scenarios and evaluate the results to select the best-performing combinations. In this study, classical methods, such as grid search, random search, and Bayesian optimization, and population-based algorithms, such as genetic algorithms and particle swarm optimization, are discussed for HPO. The use of the related search algorithms is explained together with Python code developed with packages such as Scikit-learn, Sklearn Genetic, and Optuna. The performance of the search algorithms is compared on a sample data set, and according to the results, the particle swarm optimization algorithm outperforms the other algorithms.
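As a hedged illustration of the classical search strategies named in the abstract (grid search, random search, and Bayesian-style optimization with Optuna), the sketch below tunes a random forest classifier with Scikit-learn and Optuna. The estimator, the breast-cancer dataset, the search spaces, and the trial counts are illustrative assumptions, not the experimental setup of the paper, and the genetic-algorithm and particle-swarm variants are omitted here.

# Hedged sketch: grid search, random search, and Optuna (TPE-based) HPO for a
# random forest. Dataset, estimator, and search spaces are illustrative
# assumptions, not the paper's experimental configuration.
import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV, cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Grid search: exhaustively evaluates every combination in the grid.
grid = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 200], "max_depth": [4, 8, None]},
    cv=5,
    scoring="accuracy",
)
grid.fit(X, y)
print("Grid search best:", grid.best_params_, grid.best_score_)

# Random search: samples a fixed number of combinations from a larger space.
rand = RandomizedSearchCV(
    RandomForestClassifier(random_state=42),
    param_distributions={
        "n_estimators": [50, 100, 200, 300],
        "max_depth": [2, 4, 8, 16, None],
    },
    n_iter=10,
    cv=5,
    scoring="accuracy",
    random_state=42,
)
rand.fit(X, y)
print("Random search best:", rand.best_params_, rand.best_score_)

# Optuna: sequential model-based search (TPE sampler by default) over the same
# hyperparameters, maximizing mean cross-validated accuracy.
def objective(trial):
    model = RandomForestClassifier(
        n_estimators=trial.suggest_int("n_estimators", 50, 300),
        max_depth=trial.suggest_int("max_depth", 2, 16),
        random_state=42,
    )
    return cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print("Optuna best:", study.best_params, study.best_value)

In this sketch, grid search scales combinatorially with the grid size, random search caps the budget via n_iter, and Optuna adaptively proposes new trials from the results of earlier ones, which is the practical trade-off the study compares across methods.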
Submitted: Feb 14, 2023