Neural Network
Neural networks are computational models inspired by the structure and function of the brain, aimed primarily at approximating complex functions and solving diverse problems by learning from data. Current research emphasizes improving efficiency and robustness, exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, and developing methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances are driving progress in computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
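To make the sinusoidal-neural-field idea mentioned above concrete, here is a minimal NumPy sketch of a SIREN-style coordinate network: each layer applies sin(omega_0 * (Wx + b)), mapping input coordinates to signal values. The layer sizes, the frequency factor omega_0, and the initialization bounds are illustrative assumptions and are not taken from any paper listed below.

```python
import numpy as np

class SinusoidalField:
    """Minimal sketch of a sinusoidal neural field (SIREN-style) in NumPy."""

    def __init__(self, sizes, omega_0=30.0, seed=0):
        rng = np.random.default_rng(seed)
        self.omega_0 = omega_0
        self.weights, self.biases = [], []
        for i, (fan_in, fan_out) in enumerate(zip(sizes[:-1], sizes[1:])):
            # Assumed initialization: a 1/fan_in range for the first layer and
            # the commonly cited sqrt(6 / fan_in) / omega_0 bound for later layers.
            bound = 1.0 / fan_in if i == 0 else np.sqrt(6.0 / fan_in) / omega_0
            self.weights.append(rng.uniform(-bound, bound, size=(fan_in, fan_out)))
            self.biases.append(np.zeros(fan_out))

    def __call__(self, coords):
        h = coords
        for W, b in zip(self.weights[:-1], self.biases[:-1]):
            h = np.sin(self.omega_0 * (h @ W + b))        # sinusoidal activation
        return h @ self.weights[-1] + self.biases[-1]     # linear output layer

# Usage: map 2-D coordinates in [-1, 1]^2 to a scalar field value.
field = SinusoidalField([2, 64, 64, 1])
xy = np.stack(np.meshgrid(np.linspace(-1, 1, 8),
                          np.linspace(-1, 1, 8)), axis=-1).reshape(-1, 2)
print(field(xy).shape)  # (64, 1)
```

The periodic activations let such a field represent fine detail in the signal it fits, which is why sinusoidal fields are attractive for coordinate-based modelling; in practice the weights would be trained with a standard gradient-based optimizer rather than used at initialization as shown here.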
Papers
The Role of Deep Learning Regularizations on Actors in Offline RL
Denis Tarasov, Anja Surina, Caglar Gulcehre
Optimizing Neural Network Performance and Interpretability with Diophantine Equation Encoding
Ronald Katende
E-commerce Webpage Recommendation Scheme Based on Semantic Mining and Neural Networks
Wenchao Zhao, Xiaoyi Liu, Ruilin Xu, Lingxi Xiao, Muqing Li
Constructing an Interpretable Deep Denoiser by Unrolling Graph Laplacian Regularizer
Seyed Alireza Hosseini, Tam Thuc Do, Gene Cheung, Yuichi Tanaka
Symmetry Breaking in Neural Network Optimization: Insights from Input Dimension Expansion
Jun-Jie Zhang, Nan Cheng, Fu-Peng Li, Xiu-Cheng Wang, Jian-Nan Chen, Long-Gang Pang, Deyu Meng
Distributed Cooperative AI for Large-Scale Eigenvalue Computations Using Neural Networks
Ronald Katende
Towards Robust Uncertainty-Aware Incomplete Multi-View Classification
Mulin Chen, Haojian Huang, Qiang Li
Symmetry constrained neural networks for detection and localization of damage in metal plates
James Amarel, Christopher Rudolf, Athanasios Iliopoulos, John Michopoulos, Leslie N. Smith
A Comprehensive Comparison Between ANNs and KANs For Classifying EEG Alzheimer's Data
Akshay Sunkara, Sriram Sattiraju, Aakarshan Kumar, Zaryab Kanjiani, Himesh Anumala
A general reduced-order neural operator for spatio-temporal predictive learning on complex spatial domains
Qinglu Meng, Yingguang Li, Zhiliang Deng, Xu Liu, Gengxiang Chen, Qiutong Wu, Changqing Liu, Xiaozhong Hao
On the Convergence Analysis of Over-Parameterized Variational Autoencoders: A Neural Tangent Kernel Perspective
Li Wang, Wei Huang
Early-exit Convolutional Neural Networks
Edanur Demir, Emre Akbas
RotCAtt-TransUNet++: Novel Deep Neural Network for Sophisticated Cardiac Segmentation
Quoc-Bao Nguyen-Le, Tuan-Hy Le, Anh-Triet Do, Quoc-Huy Trinh
SEF: A Method for Computing Prediction Intervals by Shifting the Error Function in Neural Networks
E. V. Aretos, D. G. Sotiropoulos
MaxCutPool: differentiable feature-aware Maxcut for pooling in graph neural networks
Carlo Abate, Filippo Maria Bianchi
From Computation to Consumption: Exploring the Compute-Energy Link for Training and Testing Neural Networks for SED Systems
Constance Douwes, Romain Serizel
Evaluating Neural Networks Architectures for Spring Reverb Modelling
Francesco Papaleo, Xavier Lizarraga-Seijas, Frederic Font
Solve paint color effect prediction problem in trajectory optimization of spray painting robot using artificial neural network inspired by the Kubelka Munk model
Hexiang Wang, Zhiyuan Bi, Zhen Cheng, Xinru Li, Jiake Zhu, Liyuan Jiang, Hao Li, Shizhou Lu
Accelerating Training with Neuron Interaction and Nowcasting Networks
Boris Knyazev, Abhinav Moudgil, Guillaume Lajoie, Eugene Belilovsky, Simon Lacoste-Julien