Hyperparameter
Hyperparameters are the settings of a machine learning model that are not learned from data but are set beforehand; they significantly impact both model performance and resource consumption. Current research focuses on optimizing hyperparameter selection across model architectures including deep neural networks, large language models, and Gaussian processes, often employing techniques such as Bayesian optimization, evolutionary algorithms, and novel mathematical frameworks to improve efficiency and generalization. Effective hyperparameter tuning is crucial for achieving optimal model performance, reducing computational costs (including energy consumption), and improving the reliability and reproducibility of machine learning results across diverse applications.
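As a minimal illustration of the tuning loop described above, the sketch below runs a random search over two hyperparameters (learning rate and regularization strength) of a toy one-dimensional gradient-descent model. The toy model, the log-uniform sampling ranges, and the trial budget are all illustrative assumptions, not taken from any of the papers listed here.

```python
import random

def train_and_score(lr, reg, steps=200):
    """Toy model: fit scalar w to minimize mean (w*x - y)^2 + reg*w^2
    by gradient descent. lr and reg are the hyperparameters under tuning."""
    data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]  # roughly y = 2x
    w = 0.0
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data) + 2 * reg * w
        w -= lr * grad
    # score: plain mean squared error, without the regularization term
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def random_search(n_trials=30, seed=0):
    """Random search: sample hyperparameter settings, keep the best score."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        lr = 10 ** rng.uniform(-3, -1)   # log-uniform learning rate
        reg = 10 ** rng.uniform(-4, -1)  # log-uniform regularization
        loss = train_and_score(lr, reg)
        if best is None or loss < best[0]:
            best = (loss, {"lr": lr, "reg": reg})
    return best

loss, params = random_search()
```

Random search is the simplest of the strategies the summary mentions; Bayesian optimization replaces the uniform sampling step with a surrogate model that proposes promising settings, but the evaluate-and-keep-best loop is the same.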
Papers
An Improved Model Ensembled of Different Hyper-parameter Tuned Machine Learning Algorithms for Fetal Health Prediction
Md. Simul Hasan Talukder, Sharmin Akter
Automatic Tuning of Loss Trade-offs without Hyper-parameter Search in End-to-End Zero-Shot Speech Synthesis
Seongyeon Park, Bohyung Kim, Tae-hyun Oh
Analyzing the Impact of Varied Window Hyper-parameters on Deep CNN for sEMG based Motion Intent Classification
Frank Kulwa, Oluwarotimi Williams Samuel, Mojisola Grace Asogbon, Olumide Olayinka Obe, Guanglin Li
Pareto Driven Surrogate (ParDen-Sur) Assisted Optimisation of Multi-period Portfolio Backtest Simulations
Terence L. van Zyl, Matthew Woolway, Andrew Paskaramoorthy