Zero-Shot Neural Architecture Search

Zero-shot neural architecture search (NAS) aims to automate the design of efficient neural networks without the computationally expensive candidate training that traditional NAS methods require. Current research focuses on developing accurate and generalizable "zero-cost proxies": performance predictors that estimate a network's final accuracy from its architecture or from cheap statistics of the untrained network, often employing techniques such as symbolic equation modeling, graph convolutional networks, and transformer-based embeddings. These advances significantly reduce the search time and resource consumption of NAS, enabling the discovery of architectures optimized for various tasks and hardware platforms, particularly resource-constrained devices such as microcontrollers. This in turn accelerates the development of efficient and effective deep learning models across diverse applications.
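To make the idea of a zero-cost proxy concrete, the sketch below implements a SynFlow-style score in PyTorch: the sum of |weight × gradient| over all parameters, computed from a single forward/backward pass on an all-ones input to an untrained network. This is only one illustrative proxy, not necessarily the technique used by any particular paper below; the names `synflow_score` and `_linearize`, the input shape, and the toy models are assumptions for the example, and production implementations typically also restore the original weight signs and use double precision to avoid overflow.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def _linearize(model: nn.Module) -> dict:
    """Replace every weight by its absolute value so no signal cancels
    during the forward pass; return the original signs (a full
    implementation would restore them afterwards)."""
    signs = {}
    for name, tensor in model.state_dict().items():
        signs[name] = torch.sign(tensor)
        tensor.abs_()
    return signs

def synflow_score(model: nn.Module, input_shape=(1, 3, 32, 32)) -> float:
    """SynFlow-style zero-cost proxy for an *untrained* model:
    sum_i |theta_i * d(output_sum)/d(theta_i)| on an all-ones input."""
    model.eval()                          # avoid batch-size-1 BatchNorm issues
    _linearize(model)
    model.zero_grad()
    probe = torch.ones(input_shape)       # constant "probe" input
    model(probe).sum().backward()         # scalar loss = sum of outputs
    return sum((p * p.grad).abs().sum().item()
               for p in model.parameters() if p.grad is not None)

# Example: rank two candidate widths without training either network.
for width in (8, 16):
    net = nn.Sequential(
        nn.Conv2d(3, width, kernel_size=3, padding=1), nn.ReLU(),
        nn.Flatten(), nn.Linear(width * 32 * 32, 10))
    print(f"width={width:2d}  synflow={synflow_score(net):.3e}")
```

In a zero-shot search loop, scores like this replace trained validation accuracy: candidates are sampled, scored in seconds, and only the top-ranked architecture is ever trained.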

Papers