Hyperparameter Configuration
Hyperparameter configuration, the process of selecting settings for machine learning algorithms that are fixed before training, aims to maximize model performance and generalization. Current research focuses on improving the efficiency and robustness of hyperparameter optimization (HPO) methods, exploring techniques such as Bayesian optimization, multi-fidelity approaches (e.g., Hyperband), and ensemble methods (e.g., stacking with boosting), often incorporating surrogate models or meta-learning. Effective HPO is crucial for achieving state-of-the-art results across machine learning applications, affecting both the efficiency of model development and the final performance of deployed systems.
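To make the multi-fidelity idea concrete, here is a minimal pure-Python sketch of successive halving, the core subroutine of Hyperband: many configurations are evaluated on a small budget, the weakest are discarded, and survivors receive geometrically larger budgets. The objective function, parameter ranges, and names (`evaluate`, `successive_halving`, `eta`) are illustrative assumptions, not from any specific library.

```python
import random

# Hypothetical objective: validation score of a configuration at a given
# training budget. Assumption: larger budgets yield less noisy estimates.
def evaluate(config, budget):
    lr, depth = config
    noise = random.gauss(0, 1.0 / budget)
    # Toy score peaking near lr=0.1, depth=6 (illustrative only).
    return -((lr - 0.1) ** 2) - 0.01 * (depth - 6) ** 2 + noise

def successive_halving(n_configs=27, min_budget=1, eta=3, seed=0):
    random.seed(seed)
    # Sample an initial pool of random configurations (lr, depth).
    configs = [(random.uniform(0.001, 1.0), random.randint(1, 12))
               for _ in range(n_configs)]
    budget = min_budget
    while len(configs) > 1:
        # Score every surviving configuration at the current budget.
        scored = sorted(configs, key=lambda c: evaluate(c, budget),
                        reverse=True)
        # Keep the top 1/eta fraction and multiply the budget by eta.
        configs = scored[:max(1, len(configs) // eta)]
        budget *= eta
    return configs[0]

best_lr, best_depth = successive_halving()
print(best_lr, best_depth)
```

Hyperband itself runs several such brackets with different trade-offs between `n_configs` and `min_budget`, hedging against objectives where low-budget scores are unreliable.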
Papers
(Paper titles were not preserved; the listing covered entries dated from July 14, 2022 through September 24, 2024.)