One-Shot Neural Architecture Search

One-shot neural architecture search (NAS) aims to discover high-performing neural network architectures efficiently by training a single "supernet" that encompasses many candidate architectures, avoiding the computationally expensive process of training each candidate individually. Current research focuses on improving how accurately the supernet predicts the performance of candidate architectures, often through techniques such as topological simplification, improved sampling strategies (e.g., importance sampling, probability shifting), and the incorporation of hardware constraints or quantization policies directly into the search process. These advances substantially reduce the time and resources required for NAS, with applications ranging from resource-constrained edge devices to large language model compression, and ultimately accelerate the development of high-performing, efficient neural networks.
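
The weight-sharing idea at the core of one-shot NAS can be illustrated with a small sketch. Below is a minimal, hypothetical example in the single-path style: a toy supernet whose layers each hold several candidate operations, with one operation sampled uniformly per layer at every training step so that all candidates share the supernet's weights. The class and function names (`MixedLayer`, `SuperNet`, `sample_architecture`) and the toy search space are illustrative assumptions, not taken from any particular paper or library.

```python
# Minimal single-path one-shot supernet sketch (illustrative, not a reference
# implementation). Search space: per-layer choice of 3x3 conv, 5x5 conv, identity.
import random
import torch
import torch.nn as nn


class MixedLayer(nn.Module):
    """One supernet layer holding the weights of every candidate operation."""

    def __init__(self, channels):
        super().__init__()
        self.candidates = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.Identity(),
        ])

    def forward(self, x, choice):
        # Only the sampled candidate runs and receives gradient updates;
        # all candidates live inside the single shared supernet.
        return self.candidates[choice](x)


class SuperNet(nn.Module):
    def __init__(self, channels=16, depth=4, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.layers = nn.ModuleList(MixedLayer(channels) for _ in range(depth))
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x, architecture):
        x = self.stem(x)
        for layer, choice in zip(self.layers, architecture):
            x = layer(x, choice)
        x = x.mean(dim=(2, 3))  # global average pooling
        return self.head(x)


def sample_architecture(depth=4, num_choices=3):
    # Uniform sampling; importance sampling or probability shifting would
    # replace this with a biased or learned distribution.
    return [random.randrange(num_choices) for _ in range(depth)]


# One training step: sample a sub-network, run it, update the shared weights.
model = SuperNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.05, momentum=0.9)
criterion = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 32, 32)       # stand-in for a real mini-batch
labels = torch.randint(0, 10, (8,))

arch = sample_architecture()
loss = criterion(model(images, arch), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

After supernet training, candidate architectures are typically ranked by evaluating sampled sub-networks with the inherited shared weights; replacing the uniform `sample_architecture` with a biased or learned distribution is one place where the improved sampling strategies mentioned above would fit into this loop.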

Papers