Hyperparameter Optimization
Hyperparameter optimization (HPO) is the problem of efficiently finding good settings for the parameters that control machine learning model training, settings that can affect performance significantly. Current research emphasizes more efficient HPO methods, particularly for large datasets and continual learning scenarios, exploring techniques such as metaheuristics, Bayesian optimization, and novel algorithms tailored to specific model architectures (e.g., Gaussian processes, neural networks, and reservoir computing). These advances are crucial for improving the scalability and reliability of machine learning across diverse applications, from economic forecasting to scientific discovery.
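To make the setting concrete, here is a minimal sketch of the simplest HPO baseline, random search: sample hyperparameter configurations from a search space, evaluate each against a validation objective, and keep the best. The `validation_loss` function below is a hypothetical stand-in (a real system would train a model and measure held-out error), and the parameter names `lr` and `reg` are illustrative assumptions, not taken from any specific paper.

```python
import random

def validation_loss(lr, reg):
    # Hypothetical stand-in for the expensive inner loop: in practice
    # this would train a model with these hyperparameters and return
    # its validation error. Here the optimum sits at lr=0.1, reg=0.01.
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

def random_search(n_trials=200, seed=0):
    # Sample configurations log-uniformly, a common choice for
    # scale-sensitive hyperparameters like learning rates.
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = {
            "lr": 10 ** rng.uniform(-4, 0),    # learning rate in [1e-4, 1]
            "reg": 10 ** rng.uniform(-5, -1),  # regularization in [1e-5, 1e-1]
        }
        loss = validation_loss(cfg["lr"], cfg["reg"])
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss
```

Bayesian optimization improves on this baseline by fitting a surrogate model (often a Gaussian process) to past evaluations and choosing the next configuration where the surrogate predicts the most promise, which matters when each evaluation is a full training run.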