Neural Architecture Search
Neural Architecture Search (NAS) automates the design of neural network architectures, aiming to replace the time-consuming and often suboptimal process of manual design. Current research focuses on improving search efficiency, exploring different search algorithms (including reinforcement learning, evolutionary algorithms, and gradient-based methods), and developing effective zero-cost proxies that estimate an architecture's quality without full training, reducing computational demands. The field is significant because it promises to accelerate the development of high-performing models across diverse applications, from image recognition and natural language processing to resource-constrained environments such as microcontrollers and in-memory computing.
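To make the evolutionary-search idea above concrete, here is a minimal sketch of an evolutionary NAS loop over a toy search space. Everything in it is an illustrative assumption: the search space (a list of layer widths), the `mutate` operator, and especially `proxy_score`, which stands in for a real zero-cost proxy or trained-accuracy evaluation and is just a dummy heuristic here.

```python
import random

# Hypothetical toy search space: an architecture is a list of layer widths.
DEPTHS = [2, 4, 6]
WIDTHS = [16, 32, 64]

def random_arch():
    """Sample a random architecture from the toy search space."""
    depth = random.choice(DEPTHS)
    return [random.choice(WIDTHS) for _ in range(depth)]

def mutate(arch):
    """Change one layer's width at random (depth is kept fixed)."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = random.choice(WIDTHS)
    return child

def proxy_score(arch):
    """Stand-in for a zero-cost proxy: a dummy heuristic that prefers a
    moderate parameter count. A real proxy would inspect the network
    itself (e.g. gradients at initialization), not just its size."""
    params = sum(a * b for a, b in zip([3] + arch, arch))
    return -abs(params - 5000)

def evolve(generations=50, pop_size=10, seed=0):
    """Simple (mu + lambda)-style loop: keep the top half, refill by mutation."""
    random.seed(seed)
    pop = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=proxy_score, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=proxy_score)

best = evolve()
```

Swapping `proxy_score` for a trained-accuracy evaluation recovers classic evolutionary NAS; keeping it cheap is exactly what zero-cost proxies aim to enable.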
Papers
Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting
Zimeng Lyu, Alexander Ororbia, Travis Desell
Multiobjective Evolutionary Pruning of Deep Neural Networks with Transfer Learning for improving their Performance and Robustness
Javier Poyatos, Daniel Molina, Aitor Martínez, Javier Del Ser, Francisco Herrera
Quantifying uncertainty for deep learning based forecasting and flow-reconstruction using neural architecture search ensembles
Romit Maulik, Romain Egele, Krishnan Raghavan, Prasanna Balaprakash