Neural Network
Neural networks are computational models inspired by the structure and function of the brain, designed to approximate complex functions and solve diverse problems by learning from data. Current research emphasizes efficiency and robustness: exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, and developing methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances drive progress in computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
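To make the "approximating complex functions by learning from data" idea concrete, here is a minimal sketch (not taken from any of the listed papers; the hidden width, learning rate, and target function are illustrative choices) of a one-hidden-layer network trained by gradient descent to fit sin(x), using only NumPy:

```python
import numpy as np

# Training data: sample the target function sin(x) on [-pi, pi].
rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(X)

# One hidden layer of 32 tanh units; weights initialized at random.
W1 = rng.normal(0, 1, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 1, (32, 1)); b2 = np.zeros(1)
lr = 0.01  # learning rate (illustrative)

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden activations
    return h, h @ W2 + b2      # network prediction

_, pred0 = forward(X)
mse0 = float(((pred0 - y) ** 2).mean())  # loss before training

for step in range(2000):
    h, pred = forward(X)
    err = pred - y                         # dL/dpred for mean-squared error
    # Backpropagation: gradients for the output layer, then the hidden layer.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)         # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    # Gradient-descent parameter updates.
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
mse = float(((pred - y) ** 2).mean())  # loss after training
```

The same loop structure (forward pass, loss, backpropagated gradients, parameter update) underlies the far larger architectures studied in the papers below; only the layer types, objectives, and scale differ.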
Papers
Highly Accurate Real-space Electron Densities with Neural Networks
Lixue Cheng, P. Bernát Szabó, Zeno Schätzle, Derk Kooi, Jonas Köhler, Klaas J. H. Giesbertz, Frank Noé, Jan Hermann, Paola Gori-Giorgi, Adam Foster
CoLaNET -- A Spiking Neural Network with Columnar Layered Architecture for Classification
Mikhail Kiselev
An Efficient General-Purpose Optical Accelerator for Neural Networks
Sijie Fei, Amro Eldebiky, Grace Li Zhang, Bing Li, Ulf Schlichtmann
Forecasting infectious disease prevalence with associated uncertainty using neural networks
Michael Morris
Learning Robust Representations for Communications over Noisy Channels
Sudharsan Senthil, Shubham Paul, Nambi Seshadri, R. David Koilpillai
On the optimal approximation of Sobolev and Besov functions using deep ReLU neural networks
Yunfei Yang
Improving Adaptivity via Over-Parameterization in Sequence Models
Yicheng Li, Qian Lin
Physics-Informed Neural Networks and Extensions
Maziar Raissi, Paris Perdikaris, Nazanin Ahmadi, George Em Karniadakis
Maelstrom Networks
Matthew Evanusa, Cornelia Fermüller, Yiannis Aloimonos
ElasticAI: Creating and Deploying Energy-Efficient Deep Learning Accelerator for Pervasive Computing
Chao Qian, Tianheng Ling, Gregor Schiele
Addressing Common Misinterpretations of KART and UAT in Neural Network Literature
Vugar Ismailov
Reconsidering the energy efficiency of spiking neural networks
Zhanglu Yan, Zhenyu Bai, Weng-Fai Wong
Convolutional Neural Network Compression Based on Low-Rank Decomposition
Yaping He, Linhao Jiang, Di Wu
ART: Actually Robust Training
Sebastian Chwilczyński, Kacper Trębacz, Karol Cyganik, Mateusz Małecki, Dariusz Brzezinski
Machine Learning of Nonlinear Dynamical Systems with Control Parameters Using Feedforward Neural Networks
Hidetsugu Sakaguchi
Deep Learning to Predict Late-Onset Breast Cancer Metastasis: the Single Hyperparameter Grid Search (SHGS) Strategy for Meta Tuning Concerning Deep Feed-forward Neural Network
Yijun Zhou, Om Arora-Jain, Xia Jiang
An Artificial Neural Network for Image Classification Inspired by Aversive Olfactory Learning Circuits in Caenorhabditis Elegans
Xuebin Wang, Chunxiuzi Liu, Meng Zhao, Ke Zhang, Zengru Di, He Liu