Architecture Search
Neural architecture search (NAS) automates the design of neural network architectures, aiming to improve model performance and efficiency across a range of tasks. Current research focuses on more efficient search algorithms, including those that leverage zero-cost proxies, large language models, and reinforcement learning, as well as on jointly optimizing architecture and hardware parameters for specific deployment environments (e.g., MCUs and edge devices). These advances matter because they accelerate the development of high-performing, resource-efficient models across diverse applications, from computer vision and natural language processing to recommendation systems and quantum machine learning.
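To make the zero-cost-proxy idea concrete, here is a minimal sketch in PyTorch. It is illustrative only and not tied to any specific paper: the toy MLP search space and the gradient-norm proxy are assumptions chosen for brevity. The proxy scores each untrained candidate by the gradient norm it produces on a single random minibatch, which is cheap to compute and can stand in for full training when ranking candidates.

```python
# Minimal sketch of a zero-cost proxy for ranking candidate architectures.
# Assumptions: a toy MLP search space and a gradient-norm proxy (illustrative only).
import torch
import torch.nn as nn


def build_candidate(depth: int, width: int, in_dim: int = 32, num_classes: int = 10) -> nn.Module:
    """Construct one candidate architecture from the toy search space."""
    layers, dim = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(dim, width), nn.ReLU()]
        dim = width
    layers.append(nn.Linear(dim, num_classes))
    return nn.Sequential(*layers)


def grad_norm_proxy(model: nn.Module, in_dim: int = 32, batch_size: int = 64) -> float:
    """Score an untrained model by the L2 norm of its gradients on one random batch."""
    x = torch.randn(batch_size, in_dim)
    y = torch.randint(0, 10, (batch_size,))
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    total = 0.0
    for p in model.parameters():
        if p.grad is not None:
            total += p.grad.norm().item() ** 2
    return total ** 0.5


if __name__ == "__main__":
    # Rank a handful of candidates without training any of them.
    search_space = [(d, w) for d in (2, 4, 8) for w in (64, 256)]
    scores = {(d, w): grad_norm_proxy(build_candidate(d, w)) for d, w in search_space}
    for (d, w), s in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"depth={d:<2} width={w:<4} proxy score={s:.2f}")
```

In a real NAS pipeline, a proxy like this would be used to prune or pre-rank a large candidate pool before any expensive training or hardware-aware evaluation is run.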
Papers