Neural Architecture Search
Neural Architecture Search (NAS) automates the design of optimal neural network architectures, aiming to replace the time-consuming and often suboptimal process of manual design. Current research focuses on improving efficiency, exploring various search algorithms (including reinforcement learning, evolutionary algorithms, and gradient-based methods), and developing effective zero-cost proxies to reduce computational demands. This field is significant because it promises to accelerate the development of high-performing models across diverse applications, from image recognition and natural language processing to resource-constrained environments like microcontrollers and in-memory computing.
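To make the search loop concrete, here is a minimal, hypothetical sketch of NAS by random search over a toy search space, scored with a stand-in "zero-cost proxy" (a simple parameter-count heuristic; real proxies such as those studied in LPZero use statistics of an untrained network). All names and the search space are illustrative, not from any of the papers listed below.

```python
import random

# Toy NAS search space: each candidate architecture is a (depth, width,
# activation) combination. Real spaces are far larger and cell-based.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [32, 64, 128],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Draw one candidate uniformly at random from the search space."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Stand-in zero-cost proxy: a crude parameter-count heuristic.
    Real proxies score an untrained network from gradients/activations,
    avoiding full training for every candidate."""
    return arch["depth"] * arch["width"] ** 2

def random_search(n_trials=20, seed=0):
    """Evaluate n_trials random candidates; return the best by proxy score."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
```

Reinforcement-learning, evolutionary, and gradient-based NAS methods replace the random sampler with a learned or population-based proposal mechanism, but the sample-score-select loop above is the common skeleton.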
Papers
Simultaneous Weight and Architecture Optimization for Neural Networks
Zitong Huang, Mansooreh Montazerin, Ajitesh Srivastava
Neural Architecture Search of Hybrid Models for NPU-CIM Heterogeneous AR/VR Devices
Yiwei Zhao, Ziyun Li, Win-San Khwa, Xiaoyu Sun, Sai Qian Zhang, Syed Shakib Sarwar, Kleber Hugo Stangherlin, Yi-Lun Lu, Jorge Tomas Gomez, Jae-Sun Seo, Phillip B. Gibbons, Barbara De Salvo, Chiao Liu
Designing a Classifier for Active Fire Detection from Multispectral Satellite Imagery Using Neural Architecture Search
Amber Cassimon, Phil Reiter, Siegfried Mercelis, Kevin Mets
LPZero: Language Model Zero-cost Proxy Search from Zero
Peijie Dong, Lujun Li, Xiang Liu, Zhenheng Tang, Xuebo Liu, Qiang Wang, Xiaowen Chu
Cartesian Genetic Programming Approach for Designing Convolutional Neural Networks
Maciej Krzywda, Szymon Łukasik, Amir H. Gandomi
POMONAG: Pareto-Optimal Many-Objective Neural Architecture Generator
Eugenio Lomurno, Samuele Mariani, Matteo Monti, Matteo Matteucci
Lightweight Neural Architecture Search for Cerebral Palsy Detection
Felix Tempel, Espen Alexander F. Ihlen, Inga Strümke