Hyper-Parameter Optimization
Hyper-parameter optimization (HPO) aims to efficiently find the best settings for a machine learning model's hyper-parameters, such as the learning rate or architecture choices, maximizing predictive performance while minimizing computational cost. Current research focuses on developing more efficient search algorithms, including Bayesian optimization, reinforcement learning, and evolutionary methods, often integrated with novel architectures such as transformers. These advances are crucial for improving the scalability and performance of machine learning across diverse applications, from large language models to resource-constrained IoT devices, and they are driving the development of automated machine learning (AutoML) tools.
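To make the sequential, model-based flavor of HPO concrete, the sketch below uses Optuna, an open-source HPO library whose default TPE sampler is a Bayesian-optimization-style method, to tune a random-forest classifier. The library choice, dataset, and search space are illustrative assumptions, not drawn from any of the papers listed here.

```python
# Minimal HPO sketch using Optuna (assumed library choice for illustration).
# The model, dataset, and search space below are hypothetical examples.
import optuna
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def objective(trial):
    # Sample hyper-parameters from the search space.
    n_estimators = trial.suggest_int("n_estimators", 50, 300)
    max_depth = trial.suggest_int("max_depth", 2, 16)
    min_samples_leaf = trial.suggest_int("min_samples_leaf", 1, 10)

    model = RandomForestClassifier(
        n_estimators=n_estimators,
        max_depth=max_depth,
        min_samples_leaf=min_samples_leaf,
        random_state=0,
    )
    X, y = load_digits(return_X_y=True)
    # The objective is mean cross-validated accuracy, to be maximized.
    return cross_val_score(model, X, y, cv=3).mean()


# Optuna's default sampler (TPE) is a sequential model-based optimizer:
# it uses the results of past trials to propose promising new settings.
study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print("Best value:", study.best_value)
print("Best hyper-parameters:", study.best_params)
```

Running more trials generally improves the best configuration found, at the price of additional compute; this trade-off is exactly what the efficiency-oriented methods above try to improve.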
Papers
Twelve related papers, published between December 2021 and October 2022.