Hyperparameter Optimization
Hyperparameter optimization (HPO) aims to efficiently find the best settings for a machine learning model's hyperparameters, the configuration choices (such as learning rate, regularization strength, or network depth) that are fixed before training rather than learned from data, with the goal of maximizing predictive performance while minimizing computational cost. Current research focuses on more efficient search algorithms, including Bayesian optimization, reinforcement learning, and evolutionary methods, often combined with novel architectures such as transformers. These advances are crucial for improving the scalability and performance of machine learning across diverse applications, from large language models to resource-constrained IoT devices, and are driving the development of automated machine learning (AutoML) tools.
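As a concrete illustration, the sketch below shows a minimal Bayesian-style HPO loop using the Optuna library (whose default sampler, the Tree-structured Parzen Estimator, is a Bayesian optimization method) together with scikit-learn. The random-forest model, the digits dataset, and the search ranges are illustrative assumptions, not drawn from the research summarized above.

```python
# Minimal HPO sketch, assuming Optuna and scikit-learn are installed.
# Model, dataset, and search ranges are illustrative choices only.
import optuna
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_digits(return_X_y=True)

def objective(trial):
    # Hyperparameters are sampled by the optimizer, not learned from data.
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 50, 300),
        "max_depth": trial.suggest_int("max_depth", 3, 20),
        "min_samples_split": trial.suggest_int("min_samples_split", 2, 10),
    }
    model = RandomForestClassifier(**params, random_state=0)
    # Cross-validated accuracy is the objective being maximized.
    return cross_val_score(model, X, y, cv=3).mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print("Best hyperparameters:", study.best_params)
print("Best CV accuracy:", study.best_value)
```

Each call to `objective` trains the model with one sampled configuration and reports cross-validated accuracy, which the sampler uses to propose the next configuration; switching to grid search, random search, or an evolutionary sampler changes only how configurations are proposed, not the surrounding loop.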