Neural Architecture Search
Neural Architecture Search (NAS) automates the design of neural network architectures, aiming to replace the time-consuming and often suboptimal process of manual design with a systematic search for high-performing models. Current research focuses on improving search efficiency, exploring different search algorithms (including reinforcement learning, evolutionary algorithms, and gradient-based methods), and developing effective zero-cost proxies that estimate a candidate's quality without full training, reducing computational demands. The field is significant because it promises to accelerate the development of high-performing models across diverse applications, from image recognition and natural language processing to resource-constrained environments such as microcontrollers and in-memory computing.
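To make the evolutionary-search variant mentioned above concrete, here is a minimal sketch of aging (tournament-based) evolution over a toy discrete search space. Everything here is illustrative: the search space, the mutation rule, and especially the `fitness` function, which stands in for the validation accuracy (or zero-cost proxy score) a real NAS system would compute for each candidate.

```python
import random

# Toy search space: an architecture is a choice of depth, layer width,
# and kernel size. Real NAS spaces are far larger (cell topologies, ops).
SEARCH_SPACE = {"depth": [2, 3, 4], "width": [16, 32, 64], "kernel": [3, 5, 7]}

def random_arch():
    """Sample one architecture uniformly from the search space."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(arch):
    """Resample a single dimension of the parent architecture."""
    child = dict(arch)
    key = random.choice(list(SEARCH_SPACE))
    child[key] = random.choice(SEARCH_SPACE[key])
    return child

def fitness(arch):
    # Placeholder objective. A real NAS loop would train the candidate
    # (or evaluate a zero-cost proxy) and return validation accuracy.
    return arch["depth"] * arch["width"] - 0.5 * arch["kernel"] ** 2

def evolutionary_search(generations=50, population_size=10, seed=0):
    random.seed(seed)
    population = [random_arch() for _ in range(population_size)]
    for _ in range(generations):
        # Tournament selection: the best of a small random sample reproduces.
        parent = max(random.sample(population, 3), key=fitness)
        population.append(mutate(parent))
        # Aging: discard the oldest member, not the worst one.
        population.pop(0)
    return max(population, key=fitness)

best = evolutionary_search()
print(best)
```

The aging step (removing the oldest candidate rather than the weakest) is the distinguishing choice of regularized evolution; it keeps the population moving and avoids premature convergence on early lucky candidates. Swapping `fitness` for a trained-model evaluation, or for a zero-cost proxy computed at initialization, recovers the two cost regimes the overview contrasts.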
Papers
SA-GNAS: Seed Architecture Expansion for Efficient Large-scale Graph Neural Architecture Search
Guanghui Zhu, Zipeng Ji, Jingyan Chen, Limin Wang, Chunfeng Yuan, Yihua Huang
ILASH: A Predictive Neural Architecture Search Framework for Multi-Task Applications
Md Hafizur Rahman, Md Mashfiq Rizvee, Sumaiya Shomaji, Prabuddha Chakraborty