Paper ID: 2410.21886
Bayesian Optimization for Hyperparameters Tuning in Neural Networks
Gabriele Onorato
This study investigates the application of Bayesian Optimization (BO) to hyperparameter tuning of neural networks, specifically targeting the improvement of Convolutional Neural Networks (CNNs) for image classification tasks. Bayesian Optimization is a derivative-free global optimization method suited to expensive black-box functions with continuous inputs and limited evaluation budgets. The BO algorithm leverages Gaussian Process regression together with acquisition functions such as Upper Confidence Bound (UCB) and Expected Improvement (EI) to identify promising configurations efficiently. Using the Ax and BoTorch frameworks, this work demonstrates that BO reduces the number of hyperparameter tuning trials required while achieving competitive model performance. Experimental results show that BO effectively balances exploration and exploitation, converging rapidly toward optimal settings for CNN architectures. These findings underscore the potential of BO for automating neural network tuning, contributing to improved accuracy and computational efficiency in machine learning pipelines.
Submitted: Oct 29, 2024
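
Below is a minimal sketch of the kind of tuning loop the abstract describes, using the Ax Service API (AxClient) over a small CNN hyperparameter space. Nothing here is taken from the paper's code: the search space, the trial budget, and the `train_and_evaluate_cnn` objective are illustrative assumptions, with a toy function standing in for actual CNN training and validation, and exact Ax signatures vary across releases.

```python
import math

from ax.service.ax_client import AxClient
from ax.service.utils.instantiation import ObjectiveProperties


def train_and_evaluate_cnn(lr: float, dropout: float, batch_size: int) -> float:
    """Toy stand-in for CNN training: returns a fake validation accuracy.

    In a real pipeline this would build and train a CNN with the sampled
    hyperparameters and return its validation accuracy.
    """
    # Illustrative surrogate that peaks near lr=1e-3, dropout=0.3 (assumption).
    return 1.0 / (1.0 + (math.log10(lr) + 3.0) ** 2 + (dropout - 0.3) ** 2 + 0.001 * batch_size)


ax_client = AxClient()
ax_client.create_experiment(
    name="cnn_hyperparameter_tuning",
    parameters=[
        {"name": "lr", "type": "range", "bounds": [1e-5, 1e-1], "log_scale": True},
        {"name": "dropout", "type": "range", "bounds": [0.0, 0.6]},
        {"name": "batch_size", "type": "choice", "values": [32, 64, 128]},
    ],
    objectives={"val_accuracy": ObjectiveProperties(minimize=False)},
)

# Run a small trial budget: Ax begins with quasi-random (Sobol) trials, then
# switches to a Gaussian Process surrogate with an acquisition function
# (e.g., Expected Improvement) to propose subsequent configurations.
for _ in range(20):
    params, trial_index = ax_client.get_next_trial()
    accuracy = train_and_evaluate_cnn(params["lr"], params["dropout"], params["batch_size"])
    ax_client.complete_trial(trial_index=trial_index, raw_data={"val_accuracy": accuracy})

best_parameters, metrics = ax_client.get_best_parameters()
print("Best configuration:", best_parameters)
```

Replacing the stand-in objective with real CNN training lets the GP surrogate and acquisition function trade off exploration and exploitation over the reported validation accuracies, which is the behavior the abstract attributes to BO.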