Neural Network
Neural networks are computational models inspired by the structure and function of the brain, aimed at approximating complex functions and solving diverse problems by learning from data. Current research emphasizes improving efficiency and robustness, exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, and developing methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances are driving progress in computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
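To make the "sinusoidal neural field" idea mentioned above concrete, here is a minimal sketch of a SIREN-style sine-activated layer mapping coordinates to values, assuming the common formulation y = sin(omega0 * (W x + b)). The class name `SineLayer`, the frequency scale `omega0`, and the uniform initialization bound are illustrative choices, not taken from any paper listed here.

```python
import numpy as np

class SineLayer:
    """One layer of a sinusoidal neural field: y = sin(omega0 * (W x + b))."""

    def __init__(self, in_dim, out_dim, omega0=30.0, seed=0):
        rng = np.random.default_rng(seed)
        # Uniform init in [-1/in_dim, 1/in_dim] keeps pre-activations well scaled.
        bound = 1.0 / in_dim
        self.W = rng.uniform(-bound, bound, size=(out_dim, in_dim))
        self.b = rng.uniform(-bound, bound, size=out_dim)
        self.omega0 = omega0

    def __call__(self, x):
        # x: (batch, in_dim) -> (batch, out_dim)
        return np.sin(self.omega0 * (x @ self.W.T + self.b))

# A two-layer field mapping 2-D coordinates to a scalar value.
layer1 = SineLayer(2, 16)
layer2 = SineLayer(16, 1)
coords = np.array([[0.0, 0.0], [0.5, -0.5]])
values = layer2(layer1(coords))
print(values.shape)  # (2, 1)
```

Stacking such layers yields a smooth, periodic-activation network that fits fine detail in signals (images, PDE solutions) better than ReLU networks of the same size, which is why this family of architectures appears in the neural-field literature.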
Papers
BDC-Occ: Binarized Deep Convolution Unit For Binarized Occupancy Network
Zongkai Zhang, Zidong Xu, Wenming Yang, Qingmin Liao, Jing-Hao Xue
Reference Neural Operators: Learning the Smooth Dependence of Solutions of PDEs on Geometric Deformations
Ze Cheng, Zhongkai Hao, Xiaoqiang Wang, Jianing Huang, Youjia Wu, Xudan Liu, Yiru Zhao, Songming Liu, Hang Su
A Real-Time Voice Activity Detection Based On Lightweight Neural
Jidong Jia, Pei Zhao, Di Wang
Detection of decision-making manipulation in the pairwise comparisons method
Michał Strada, Sebastian Ernst, Jacek Szybowski, Konrad Kułakowski
Graph neural networks with configuration cross-attention for tensor compilers
Dmitrii Khizbullin, Eduardo Rocha de Andrade, Thanh Hau Nguyen, Matheus Pedroza Ferreira, David R. Pugh
LoQT: Low-Rank Adapters for Quantized Pretraining
Sebastian Loeschcke, Mads Toftrup, Michael J. Kastoryano, Serge Belongie, Vésteinn Snæbjarnarson
On Sequential Loss Approximation for Continual Learning
Menghao Waiyan William Zhu, Ercan Engin Kuruoğlu
Partial train and isolate, mitigate backdoor attack
Yong Li, Han Gao
High-Performance Temporal Reversible Spiking Neural Networks with $O(L)$ Training Memory and $O(1)$ Inference Cost
JiaKui Hu, Man Yao, Xuerui Qiu, Yuhong Chou, Yuxuan Cai, Ning Qiao, Yonghong Tian, Bo XU, Guoqi Li
Geometry of Critical Sets and Existence of Saddle Branches for Two-layer Neural Networks
Leyang Zhang, Yaoyu Zhang, Tao Luo
Understanding the dynamics of the frequency bias in neural networks
Juan Molina, Mircea Petrache, Francisco Sahli Costabal, Matías Courdurier
Neural Pfaffians: Solving Many Many-Electron Schrödinger Equations
Nicholas Gao, Stephan Günnemann
Bounds for the smallest eigenvalue of the NTK for arbitrary spherical data of arbitrary dimension
Kedar Karhadkar, Michael Murray, Guido Montúfar
Graphcode: Learning from multiparameter persistent homology using graph neural networks
Michael Kerber, Florian Russold
Minimum number of neurons in fully connected layers of a given neural network (the first approximation)
Oleg I. Berngardt
Automatic Differentiation is Essential in Training Neural Networks for Solving Differential Equations
Chuqi Chen, Yahong Yang, Yang Xiang, Wenrui Hao