Neural Networks
Neural networks are computational models inspired by the structure and function of the brain, designed to approximate complex functions and solve diverse problems by learning from data. Current research emphasizes efficiency and robustness, exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, as well as methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances are driving progress in computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
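To make one of these architectures concrete, below is a minimal PyTorch sketch of a sinusoidal neural field in the style of SIREN, which maps input coordinates to signal values through sine-activated layers. The layer widths, the frequency factor omega_0, and the initialization bounds follow the commonly cited SIREN recipe but are illustrative assumptions, not taken from any specific paper listed here.

```python
# Minimal sinusoidal neural field (SIREN-style) sketch; assumes PyTorch.
import torch
import torch.nn as nn


class SineLayer(nn.Module):
    """Linear layer followed by a sine activation: y = sin(omega_0 * (Wx + b))."""

    def __init__(self, in_features, out_features, omega_0=30.0, is_first=False):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_features, out_features)
        # SIREN-style initialization keeps pre-activations well-distributed.
        with torch.no_grad():
            if is_first:
                bound = 1.0 / in_features
            else:
                bound = (6.0 / in_features) ** 0.5 / omega_0
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))


class SinusoidalField(nn.Module):
    """Maps coordinates (e.g. 2-D pixel locations) to signal values."""

    def __init__(self, in_features=2, hidden=256, out_features=1, layers=3):
        super().__init__()
        net = [SineLayer(in_features, hidden, is_first=True)]
        net += [SineLayer(hidden, hidden) for _ in range(layers - 1)]
        net.append(nn.Linear(hidden, out_features))  # linear output head
        self.net = nn.Sequential(*net)

    def forward(self, coords):
        return self.net(coords)


# Usage: fit the field to a signal sampled at known coordinates (toy target).
field = SinusoidalField()
coords = torch.rand(1024, 2) * 2 - 1                         # points in [-1, 1]^2
values = torch.sin(3.14159 * coords).sum(-1, keepdim=True)   # synthetic signal
loss = nn.functional.mse_loss(field(coords), values)
loss.backward()
```

The sine activations let the field represent high-frequency detail that ReLU networks of the same size tend to smooth out, which is why such fields are a common choice for fitting images, audio, and other continuous signals.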
Papers
Tool Shape Optimization through Backpropagation of Neural Network
Kento Kawaharazuka, Toru Ogawa, Cota Nabeshima
Efficient Training with Denoised Neural Weights
Yifan Gong, Zheng Zhan, Yanyu Li, Yerlan Idelbayev, Andrey Zharkov, Kfir Aberman, Sergey Tulyakov, Yanzhi Wang, Jian Ren
DFDRNN: A dual-feature based neural network for drug repositioning
Enqiang Zhu, Xiang Li, Chanjuan Liu, Nikhil R. Pal
Preconditioned Gradient Descent Finds Over-Parameterized Neural Networks with Sharp Generalization for Nonparametric Regression
Yingzhen Yang
Provable Robustness of (Graph) Neural Networks Against Data Poisoning and Backdoor Attacks
Lukas Gosch, Mahalakshmi Sabanayagam, Debarghya Ghoshdastidar, Stephan Günnemann
Data-Guided Physics-Informed Neural Networks for Solving Inverse Problems in Partial Differential Equations
Wei Zhou, Y. F. Xu
Towards Robust Event-based Networks for Nighttime via Unpaired Day-to-Night Event Translation
Yuhwan Jeong, Hoonhee Cho, Kuk-Jin Yoon
Evolved Developmental Artificial Neural Networks for Multitasking with Advanced Activity Dependence
Yintong Zhang, Jason A. Yoder
Order parameters and phase transitions of continual learning in deep neural networks
Haozhe Shan, Qianyi Li, Haim Sompolinsky
Deep Learning Activation Functions: Fixed-Shape, Parametric, Adaptive, Stochastic, Miscellaneous, Non-Standard, Ensemble
M. M. Hammad
Unexpected Benefits of Self-Modeling in Neural Systems
Vickram N. Premakumar, Michael Vaiana, Florin Pop, Judd Rosenblatt, Diogo Schwerz de Lucena, Kirsten Ziman, Michael S. A. Graziano
Partial-differential-algebraic equations of nonlinear dynamics by Physics-Informed Neural-Network: (I) Operator splitting and framework assessment
Loc Vu-Quoc, Alexander Humer
Stabilizing Dynamic Systems through Neural Network Learning: A Robust Approach
Yu Zhang, Haoyu Zhang, Yongxiang Zou, Houcheng Li, Long Cheng
Investigating Low-Rank Training in Transformer Language Models: Efficiency and Scaling Analysis
Xiuying Wei, Skander Moalla, Razvan Pascanu, Caglar Gulcehre
Team up GBDTs and DNNs: Advancing Efficient and Effective Tabular Prediction with Tree-hybrid MLPs
Jiahuan Yan, Jintai Chen, Qianxing Wang, Danny Z. Chen, Jian Wu
Weight Block Sparsity: Training, Compilation, and AI Engine Accelerators
Paolo D'Alberto, Taehee Jeong, Akshai Jain, Shreyas Manjunath, Mrinal Sarmah, Samuel Hsu, Yaswanth Raparti, Nitesh Pipralia
On Exact Bit-level Reversible Transformers Without Changing Architectures
Guoqiang Zhang, J.P. Lewis, W. B. Kleijn
Evaluating Deep Neural Networks in Deployment (A Comparative and Replicability Study)
Eduard Pinconschi, Divya Gopinath, Rui Abreu, Corina S. Pasareanu
SwishReLU: A Unified Approach to Activation Functions for Enhanced Deep Neural Networks Performance
Jamshaid Ul Rahman, Rubiqa Zulfiqar, Asad Khan, Nimra