Neural Architecture
Neural architecture research focuses on designing and optimizing the structure of artificial neural networks to improve efficiency, accuracy, and interpretability. Current efforts concentrate on developing novel architectures such as Kolmogorov-Arnold Networks and transformers, employing efficient search algorithms (e.g., evolutionary algorithms, generative flows) to explore vast design spaces, and analyzing the representational similarity and training efficiency of different models. These advances matter both for deploying deep learning in resource-constrained environments and for understanding how neural networks learn and generalize, with impact ranging from computer vision and natural language processing to scientific computing and edge devices.
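To make the search-algorithm theme above concrete, here is a minimal sketch of an evolutionary architecture search loop, assuming a deliberately toy setup: the "architecture" is just a tuple of MLP hidden-layer widths, fitness is validation accuracy on synthetic 2-D data minus a parameter-count penalty, and mutation resizes one layer. The dataset, fitness function, and all hyperparameters are illustrative stand-ins, not taken from any of the papers listed below.

```python
# Toy evolutionary architecture search: evolve MLP hidden-layer widths.
# Everything here (data, fitness, mutation rule) is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification data: two Gaussian blobs.
X = np.vstack([rng.normal(-1.0, 1.0, (200, 2)), rng.normal(1.0, 1.0, (200, 2))])
y = np.array([0] * 200 + [1] * 200)
idx = rng.permutation(len(y))
Xtr, ytr = X[idx[:300]], y[idx[:300]]
Xva, yva = X[idx[300:]], y[idx[300:]]

def fitness(widths, steps=300, lr=0.1):
    """Train a small ReLU MLP with plain SGD; return penalized val accuracy."""
    sizes = [2, *widths, 2]
    Ws = [rng.normal(0, np.sqrt(2 / a), (a, b)) for a, b in zip(sizes, sizes[1:])]
    bs = [np.zeros(b) for b in sizes[1:]]
    for _ in range(steps):
        acts = [Xtr]  # forward pass, caching each layer's input
        for W, b in zip(Ws[:-1], bs[:-1]):
            acts.append(np.maximum(acts[-1] @ W + b, 0.0))
        logits = acts[-1] @ Ws[-1] + bs[-1]
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        g = p.copy()  # backward pass: softmax cross-entropy gradient
        g[np.arange(len(ytr)), ytr] -= 1.0
        g /= len(ytr)
        for i in range(len(Ws) - 1, -1, -1):
            gW, gb = acts[i].T @ g, g.sum(axis=0)
            if i > 0:
                g = (g @ Ws[i].T) * (acts[i] > 0)  # ReLU derivative
            Ws[i] -= lr * gW
            bs[i] -= lr * gb
    h = Xva  # evaluate on the held-out split
    for W, b in zip(Ws[:-1], bs[:-1]):
        h = np.maximum(h @ W + b, 0.0)
    acc = ((h @ Ws[-1] + bs[-1]).argmax(axis=1) == yva).mean()
    n_params = sum(W.size for W in Ws) + sum(b.size for b in bs)
    return acc - 1e-4 * n_params  # accuracy traded off against model size

def mutate(widths):
    """Grow or shrink one randomly chosen hidden layer."""
    w = list(widths)
    i = rng.integers(len(w))
    w[i] = int(np.clip(w[i] + rng.choice([-4, 4]), 2, 64))
    return tuple(w)

# (mu + lambda)-style loop: keep the best 3 parents, refill with mutants.
population = [tuple(int(v) for v in rng.integers(2, 33, size=2)) for _ in range(6)]
for gen in range(10):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:3]
    children = [mutate(parents[rng.integers(len(parents))]) for _ in range(3)]
    population = parents + children
    print(f"gen {gen}: best widths so far {ranked[0]}")
```

Real NAS systems differ mainly in scale, not shape: richer search spaces (operations, connectivity), cheaper proxy evaluations (weight sharing, early stopping), and selection schemes such as regularized evolution replace the toy pieces above.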
Papers
Comgra: A Tool for Analyzing and Debugging Neural Networks
Florian Dietz, Sophie Fellenz, Dietrich Klakow, Marius Kloft
TinyChirp: Bird Song Recognition Using TinyML Models on Low-power Wireless Acoustic Sensors
Zhaolan Huang, Adrien Tousnakhoff, Polina Kozyr, Roman Rehausen, Felix Bießmann, Robert Lachlan, Cedric Adjih, Emmanuel Baccelli
Emergence in non-neural models: grokking modular arithmetic via average gradient outer product
Neil Mallinar, Daniel Beaglehole, Libin Zhu, Adityanarayanan Radhakrishnan, Parthe Pandit, Mikhail Belkin
SalNAS: Efficient Saliency-prediction Neural Architecture Search with self-knowledge distillation
Chakkrit Termritthikun, Ayaz Umer, Suwichaya Suwanwimolkul, Feng Xia, Ivan Lee