Neural Network
Neural networks are computational models inspired by the structure and function of the brain, aimed primarily at approximating complex functions and solving diverse problems by learning from data. Current research emphasizes improving efficiency and robustness; exploring novel architectures, such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions; and developing methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances are driving progress across computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
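To make one of the named architectures concrete, below is a minimal sketch of a sinusoidal neural field: a small MLP with sine activations that maps coordinates to signal values. The layer sizes, the frequency scale `omega_0 = 30`, and the SIREN-style initialization bounds are illustrative assumptions, not details drawn from any paper listed here.

```python
import numpy as np

rng = np.random.default_rng(0)

def siren_layer(fan_in, fan_out, omega_0=30.0, first=False):
    """One sine-activated layer; bounds follow the common SIREN init scheme
    (an assumption for this sketch, not taken from the papers below)."""
    bound = 1.0 / fan_in if first else np.sqrt(6.0 / fan_in) / omega_0
    W = rng.uniform(-bound, bound, size=(fan_in, fan_out))
    b = rng.uniform(-bound, bound, size=fan_out)
    return W, b, omega_0

def forward(layers, x):
    """Sine activations on hidden layers; the final layer is linear."""
    for W, b, omega_0 in layers[:-1]:
        x = np.sin(omega_0 * (x @ W + b))
    W, b, _ = layers[-1]
    return x @ W + b

# A tiny field mapping 2-D coordinates in [-1, 1]^2 to one scalar
# (e.g. an image intensity at that location).
layers = [siren_layer(2, 32, first=True),
          siren_layer(32, 32),
          siren_layer(32, 1)]
grid = np.linspace(-1.0, 1.0, 8)
coords = np.stack(np.meshgrid(grid, grid), axis=-1).reshape(-1, 2)
values = forward(layers, coords)
print(values.shape)  # one scalar per coordinate: (64, 1)
```

The coordinate-in, value-out interface is what distinguishes a neural field from a network operating on raw pixel arrays: the signal is represented implicitly by the weights, and the sine activations let a small network capture high-frequency detail.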
Papers
Graph neural networks and non-commuting operators
Mauricio Velasco, Kaiying O'Hare, Bernardo Rychtenberg, Soledad Villar
Weighted Sobolev Approximation Rates for Neural Networks on Unbounded Domains
Ahmed Abdeljawad, Thomas Dittrich
Problem Space Transformations for Generalisation in Behavioural Cloning
Kiran Doshi, Marco Bagatella, Stelian Coros
Non-Stationary Learning of Neural Networks with Automatic Soft Parameter Reset
Alexandre Galashov, Michalis K. Titsias, András György, Clare Lyle, Razvan Pascanu, Yee Whye Teh, Maneesh Sahani
Flexible task abstractions emerge in linear networks with fast and bounded units
Kai Sandbrink, Jan P. Bauer, Alexandra M. Proca, Andrew M. Saxe, Christopher Summerfield, Ali Hummos
Supervised Autoencoders with Fractionally Differentiated Features and Triple Barrier Labelling Enhance Predictions on Noisy Data
Bartosz Bieganowski, Robert Ślepaczuk
A Subsampling Based Neural Network for Spatial Data
Debjoy Thakur
Designing a Linearized Potential Function in Neural Network Optimization Using Csiszár Type of Tsallis Entropy
Keito Akiyama
Solving stochastic partial differential equations using neural networks in the Wiener chaos expansion
Ariel Neufeld, Philipp Schmocker
Neural Networks and (Virtual) Extended Formulations
Christoph Hertrich, Georg Loho
Confidence Calibration of Classifiers with Many Classes
Adrien Le Coz, Stéphane Herbin, Faouzi Adjed
Gradient Descent Finds Over-Parameterized Neural Networks with Sharp Generalization for Nonparametric Regression
Yingzhen Yang, Ping Li
Transferable polychromatic optical encoder for neural networks
Minho Choi, Jinlin Xiang, Anna Wirth-Singh, Seung-Hwan Baek, Eli Shlizerman, Arka Majumdar
Multi-modal deformable image registration using untrained neural networks
Quang Luong Nhat Nguyen, Ruiming Cao, Laura Waller
Pretrained transformer efficiently learns low-dimensional target functions in-context
Kazusato Oko, Yujin Song, Taiji Suzuki, Denny Wu
CSP-Net: Common Spatial Pattern Empowered Neural Networks for EEG-Based Motor Imagery Classification
Xue Jiang, Lubin Meng, Xinru Chen, Yifan Xu, Dongrui Wu
Learning predictable and robust neural representations by straightening image sequences
Xueyan Niu, Cristina Savin, Eero P. Simoncelli
Entropy stable conservative flux form neural networks
Lizuo Liu, Tongtong Li, Anne Gelb, Yoonsang Lee