Neural Architecture
Neural architecture research focuses on designing and optimizing the structure of artificial neural networks to improve efficiency, accuracy, and interpretability. Current efforts concentrate on developing novel architectures such as Kolmogorov-Arnold Networks and transformers, on efficient search algorithms (e.g., evolutionary algorithms, generative flows) that explore vast design spaces, and on analyzing the representational similarity and training efficiency of different models. These advances matter both for deploying deep learning in resource-constrained environments and for understanding how neural networks learn and generalize, with impact ranging from computer vision and natural language processing to scientific computing and edge devices.
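As a rough illustration of the evolutionary search strategy mentioned above, the sketch below evolves toy architecture configurations. The search space (layer widths, activation choice), the mutation operators, and the fitness proxy (negative parameter count, standing in for validation accuracy) are all hypothetical and do not come from any of the listed papers; this is a minimal, runnable skeleton of the idea, not a reference implementation.

```python
import random

# Hypothetical search space: an architecture is a stack of fully connected
# layers, described by per-layer widths plus one activation choice.
# All names, ranges, and the fitness proxy below are illustrative.
WIDTHS = [16, 32, 64, 128]
ACTIVATIONS = ["relu", "gelu", "tanh"]
MAX_DEPTH = 4

def random_arch():
    depth = random.randint(1, MAX_DEPTH)
    return {"widths": [random.choice(WIDTHS) for _ in range(depth)],
            "activation": random.choice(ACTIVATIONS)}

def mutate(arch):
    # Copy the parent, then apply one random structural edit.
    child = {"widths": list(arch["widths"]), "activation": arch["activation"]}
    op = random.choice(["widen", "deepen", "prune", "swap_act"])
    if op == "widen":
        i = random.randrange(len(child["widths"]))
        child["widths"][i] = random.choice(WIDTHS)
    elif op == "deepen" and len(child["widths"]) < MAX_DEPTH:
        child["widths"].append(random.choice(WIDTHS))
    elif op == "prune" and len(child["widths"]) > 1:
        child["widths"].pop(random.randrange(len(child["widths"])))
    else:
        child["activation"] = random.choice(ACTIVATIONS)
    return child

def fitness(arch, input_dim=32, output_dim=10):
    # Placeholder objective: in real NAS this would be (proxy-)trained
    # validation accuracy. Penalizing parameter count keeps the loop
    # runnable end to end without any training.
    dims = [input_dim] + arch["widths"] + [output_dim]
    params = sum(a * b + b for a, b in zip(dims, dims[1:]))
    return -params

def evolve(generations=20, population_size=16):
    population = [random_arch() for _ in range(population_size)]
    for _ in range(generations):
        # Keep the fitter half, refill with mutated copies of survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[: population_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(population_size - len(survivors))]
    return max(population, key=fitness)

if __name__ == "__main__":
    print("best architecture found:", evolve())
```

In real search the fitness evaluation dominates the cost, since each candidate must be at least partially trained; reducing or amortizing that cost is the focus of work such as the importance-sampling paper listed below.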
Papers
The Neural Race Reduction: Dynamics of Abstraction in Gated Networks
Andrew M. Saxe, Shagun Sodhani, Sam Lewallen
Auto Machine Learning for Medical Image Analysis by Unifying the Search on Data Augmentation and Neural Architecture
Jianwei Zhang, Dong Li, Lituan Wang, Lei Zhang
Efficient Search of Multiple Neural Architectures with Different Complexities via Importance Sampling
Yuhei Noda, Shota Saito, Shinichi Shirakawa