Neural Architecture Search
Neural Architecture Search (NAS) automates the design of neural network architectures, replacing the time-consuming and often suboptimal process of manual design. Current research focuses on improving search efficiency, exploring different search algorithms (including reinforcement learning, evolutionary algorithms, and gradient-based methods), and developing zero-cost proxies that estimate an architecture's quality without fully training it, sharply reducing computational cost. The field matters because it promises to accelerate the development of high-performing models across diverse applications, from image recognition and natural language processing to resource-constrained environments such as microcontrollers and in-memory computing.
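To make the evolutionary branch of NAS concrete, here is a minimal, self-contained sketch of evolutionary search over a toy architecture space. Everything in it is illustrative: the search space, the mutation operator, and especially `proxy_score`, which stands in for a real fitness signal (validation accuracy or a zero-cost proxy) and is not taken from any of the papers below.

```python
import random

# Toy search space: an architecture is a choice of depth, width, and activation.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6, 8],
    "width": [16, 32, 64, 128],
    "activation": ["relu", "tanh", "gelu"],
}

def random_arch(rng):
    """Sample one architecture uniformly from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(arch, rng):
    """Copy the parent and resample one randomly chosen dimension."""
    child = dict(arch)
    key = rng.choice(list(SEARCH_SPACE))
    child[key] = rng.choice(SEARCH_SPACE[key])
    return child

def proxy_score(arch):
    # Illustrative stand-in for a fitness signal (in real NAS: validation
    # accuracy or a zero-cost proxy). Rewards depth and width with
    # diminishing returns; the numbers are arbitrary.
    bonus = {"relu": 0.0, "tanh": 0.1, "gelu": 0.2}[arch["activation"]]
    return arch["num_layers"] ** 0.5 + arch["width"] ** 0.25 + bonus

def evolve(generations=20, pop_size=8, seed=0):
    """Elitist evolutionary loop: keep the top half, refill by mutation."""
    rng = random.Random(seed)
    population = [random_arch(rng) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=proxy_score, reverse=True)
        parents = population[: pop_size // 2]
        population = parents + [mutate(rng.choice(parents), rng) for _ in parents]
    return max(population, key=proxy_score)

best = evolve()
```

Real NAS systems differ mainly in what replaces `proxy_score`: training-free methods such as FreeREA substitute cheap, training-free estimates of accuracy so the same evolutionary loop can run in minutes rather than GPU-days.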
Papers
FreeREA: Training-Free Evolution-based Architecture Search
Niccolò Cavagnero, Luca Robbiano, Barbara Caputo, Giuseppe Averta
DFG-NAS: Deep and Flexible Graph Neural Architecture Search
Wentao Zhang, Zheyu Lin, Yu Shen, Yang Li, Zhi Yang, Bin Cui
Neural Architecture Adaptation for Object Detection by Searching Channel Dimensions and Mapping Pre-trained Parameters
Harim Jung, Myeong-Seok Oh, Cheoljong Yang, Seong-Whan Lee
RT-DNAS: Real-time Constrained Differentiable Neural Architecture Search for 3D Cardiac Cine MRI Segmentation
Qing Lu, Xiaowei Xu, Shunjie Dong, Cong Hao, Lei Yang, Cheng Zhuo, Yiyu Shi
Towards Self-supervised and Weight-preserving Neural Architecture Search
Zhuowei Li, Yibo Gao, Zhenzhou Zha, Zhiqiang HU, Qing Xia, Shaoting Zhang, Dimitris N. Metaxas