Neural Architecture Search
Neural Architecture Search (NAS) automates the design of neural network architectures, aiming to replace the time-consuming and often suboptimal process of designing them by hand. Current research focuses on improving search efficiency, exploring different search algorithms (including reinforcement learning, evolutionary algorithms, and gradient-based methods), and developing zero-cost proxies that estimate an architecture's quality without fully training it, sharply reducing computational demands. The field is significant because it promises to accelerate the development of high-performing models across diverse applications, from image recognition and natural language processing to resource-constrained settings such as microcontrollers and in-memory computing.
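To make the overview concrete, the sketch below combines two of the ideas mentioned above: an evolutionary search loop guided by a cheap scoring function standing in for a zero-cost proxy. Everything here is illustrative — the search space, the mutation scheme, and especially `proxy_score` (a toy heuristic, not a real proxy such as a gradient-based saliency measure computed at initialisation) are assumptions, not any specific paper's method.

```python
import random

# Toy search space: each architecture is a choice of depth, width, and
# kernel size. Real NAS spaces are far larger and often graph-structured.
SEARCH_SPACE = {"depth": [2, 3, 4], "width": [16, 32, 64], "kernel": [3, 5, 7]}

def sample_architecture(rng):
    """Sample a random architecture from the toy search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(arch, rng):
    """Mutate one randomly chosen dimension of the architecture."""
    child = dict(arch)
    key = rng.choice(list(SEARCH_SPACE))
    child[key] = rng.choice(SEARCH_SPACE[key])
    return child

def proxy_score(arch):
    """Toy stand-in for a zero-cost proxy: a deterministic heuristic that
    mildly favours deeper, wider configurations. A real proxy would score
    an untrained network, e.g. from its gradients at initialisation."""
    return arch["depth"] * 0.1 + arch["width"] / 64 - abs(arch["kernel"] - 5) * 0.05

def evolutionary_search(generations=20, population_size=8, seed=0):
    """Minimal evolutionary loop: mutate the best individual and replace
    the worst, using the proxy instead of costly full training."""
    rng = random.Random(seed)
    population = [sample_architecture(rng) for _ in range(population_size)]
    for _ in range(generations):
        parent = max(population, key=proxy_score)   # select the fittest
        child = mutate(parent, rng)                 # perturb it slightly
        worst = min(population, key=proxy_score)    # discard the weakest
        population[population.index(worst)] = child
    return max(population, key=proxy_score)

best = evolutionary_search()
```

Swapping `proxy_score` for actual validation accuracy recovers classic evolutionary NAS at full cost; the point of proxy-based methods is that the loop above can then evaluate thousands of candidates for the price of training none.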
Papers
GraphPAS: Parallel Architecture Search for Graph Neural Networks
Jiamin Chen, Jianliang Gao, Yibo Chen, Oloulade Babatounde Moctard, Tengfei Lyu, Zhao Li
RSBNet: One-Shot Neural Architecture Search for A Backbone Network in Remote Sensing Image Recognition
Cheng Peng, Yangyang Li, Ronghua Shang, Licheng Jiao
A Multi-criteria Approach to Evolve Sparse Neural Architectures for Stock Market Forecasting
Faizal Hafiz, Jan Broekaert, Davide La Torre, Akshya Swain
Stacked BNAS: Rethinking Broad Convolutional Neural Network for Neural Architecture Search
Zixiang Ding, Yaran Chen, Nannan Li, Dongbin Zhao, C. L. Philip Chen