# Neural Network

Neural networks are computational models inspired by the structure and function of the brain, aimed primarily at approximating complex functions and solving diverse problems by learning from data. Current research emphasizes improving efficiency and robustness, exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, and developing methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances are driving progress in fields including computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
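The core idea above — a network learning to approximate a function from data — can be sketched in a few lines. The following is a minimal illustrative example (not drawn from any of the papers below; the target function, width, learning rate, and step count are arbitrary choices): a one-hidden-layer tanh network trained by full-batch gradient descent to fit `sin(x)` from samples.

```python
import numpy as np

# Minimal sketch: a one-hidden-layer network approximating f(x) = sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(-np.pi, np.pi, size=(256, 1))  # training inputs
y = np.sin(X)                                  # training targets

H = 32  # hidden width (arbitrary)
W1 = rng.normal(0.0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, (H, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(3000):
    # forward pass with tanh hidden units
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                       # gradient of 0.5 * squared error
    # backward pass via the chain rule
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)     # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    # gradient descent update
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE: {mse:.4f}")  # should be well below the target variance (~0.5)
```

Several of the papers below study refinements of exactly this setup: approximation rates for such networks (Abdeljawad & Dittrich), generalization of over-parameterized networks under gradient descent (Yang & Li), and non-stationary variants of the training loop (Galashov et al.).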

## Papers

### Weighted Sobolev Approximation Rates for Neural Networks on Unbounded Domains

Ahmed Abdeljawad, Thomas Dittrich

### Problem Space Transformations for Generalisation in Behavioural Cloning

Kiran Doshi, Marco Bagatella, Stelian Coros

### Non-Stationary Learning of Neural Networks with Automatic Soft Parameter Reset

Alexandre Galashov, Michalis K. Titsias, András György, Clare Lyle, Razvan Pascanu, Yee Whye Teh, Maneesh Sahani

### Flexible task abstractions emerge in linear networks with fast and bounded units

Kai Sandbrink, Jan P. Bauer, Alexandra M. Proca, Andrew M. Saxe, Christopher Summerfield, Ali Hummos

### A Subsampling Based Neural Network for Spatial Data

Debjoy Thakur

### Designing a Linearized Potential Function in Neural Network Optimization Using Csiszár Type of Tsallis Entropy

Keito Akiyama

### Solving stochastic partial differential equations using neural networks in the Wiener chaos expansion

Ariel Neufeld, Philipp Schmocker

### Neural Networks and (Virtual) Extended Formulations

Christoph Hertrich, Georg Loho

### Confidence Calibration of Classifiers with Many Classes

Adrien Le Coz, Stéphane Herbin, Faouzi Adjed

### Gradient Descent Finds Over-Parameterized Neural Networks with Sharp Generalization for Nonparametric Regression: A Distribution-Free Analysis

Yingzhen Yang, Ping Li

### Transferable polychromatic optical encoder for neural networks

Minho Choi, Jinlin Xiang, Anna Wirth-Singh, Seung-Hwan Baek, Eli Shlizerman, Arka Majumdar

### Multi-modal deformable image registration using untrained neural networks

Quang Luong Nhat Nguyen, Ruiming Cao, Laura Waller

### Pretrained transformer efficiently learns low-dimensional target functions in-context

Kazusato Oko, Yujin Song, Taiji Suzuki, Denny Wu

### Learning predictable and robust neural representations by straightening image sequences

Xueyan Niu, Cristina Savin, Eero P. Simoncelli

### Entropy stable conservative flux form neural networks

Lizuo Liu, Tongtong Li, Anne Gelb, Yoonsang Lee

### Multilevel Monte Carlo methods for simulating forward-backward stochastic differential equations using neural networks

Oliver Sheridan-Methven

### Guiding Neural Collapse: Optimising Towards the Nearest Simplex Equiangular Tight Frame

Evan Markou, Thalaiyasingam Ajanthan, Stephen Gould

### CRONOS: Enhancing Deep Learning with Scalable GPU Accelerated Convex Neural Networks

Miria Feng, Zachary Frangella, Mert Pilanci