Neural Network
Neural networks are computational models inspired by the structure and function of the brain, designed to approximate complex functions and solve diverse problems by learning from data. Current research emphasizes improving efficiency and robustness, exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, and developing methods for understanding and manipulating the representations these networks learn, for example through hyper-representations of network weights. These advances are driving progress across computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
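To make "sinusoidal neural fields" concrete, below is a minimal PyTorch sketch of a SIREN-style coordinate network: a small MLP with sine activations that maps input coordinates (e.g. pixel positions) to signal values. It is an illustrative assumption of the general technique, not the implementation from any paper listed here; the class names, layer sizes, and the frequency scale omega_0=30.0 are hypothetical choices for the example.

```python
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """Linear layer followed by a sine activation (SIREN-style building block)."""
    def __init__(self, in_features, out_features, omega_0=30.0):
        super().__init__()
        self.omega_0 = omega_0          # frequency scale; 30.0 is an illustrative default
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))

class SinusoidalField(nn.Module):
    """Maps coordinates (e.g. 2-D positions) to signal values (e.g. RGB intensities)."""
    def __init__(self, in_features=2, hidden=256, out_features=3, num_layers=3):
        super().__init__()
        blocks = [SineLayer(in_features, hidden)]
        blocks += [SineLayer(hidden, hidden) for _ in range(num_layers - 1)]
        self.net = nn.Sequential(*blocks, nn.Linear(hidden, out_features))

    def forward(self, coords):
        return self.net(coords)

# Example usage: query the field at a batch of normalized 2-D coordinates.
field = SinusoidalField()
coords = torch.rand(1024, 2)   # (x, y) positions in [0, 1)
values = field(coords)         # predicted signal at each coordinate, shape (1024, 3)
```

Such a field is typically fit by regressing its outputs against samples of a target signal (an image, an audio clip, or a PDE solution) with a standard gradient-based optimizer; the sine activations let the small network represent high-frequency detail that ReLU MLPs tend to smooth out.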
Papers
Provable Robustness of (Graph) Neural Networks Against Data Poisoning and Backdoor Attacks
Lukas Gosch, Mahalakshmi Sabanayagam, Debarghya Ghoshdastidar, Stephan Günnemann
Data-Guided Physics-Informed Neural Networks for Solving Inverse Problems in Partial Differential Equations
Wei Zhou, Y. F. Xu
Towards Robust Event-based Networks for Nighttime via Unpaired Day-to-Night Event Translation
Yuhwan Jeong, Hoonhee Cho, Kuk-Jin Yoon
Evolved Developmental Artificial Neural Networks for Multitasking with Advanced Activity Dependence
Yintong Zhang, Jason A. Yoder
Order parameters and phase transitions of continual learning in deep neural networks
Haozhe Shan, Qianyi Li, Haim Sompolinsky
Deep Learning Activation Functions: Fixed-Shape, Parametric, Adaptive, Stochastic, Miscellaneous, Non-Standard, Ensemble
M. M. Hammad
Unexpected Benefits of Self-Modeling in Neural Systems
Vickram N. Premakumar, Michael Vaiana, Florin Pop, Judd Rosenblatt, Diogo Schwerz de Lucena, Kirsten Ziman, Michael S. A. Graziano
Partial-differential-algebraic equations of nonlinear dynamics by Physics-Informed Neural-Network: (I) Operator splitting and framework assessment
Loc Vu-Quoc, Alexander Humer
Stabilizing Dynamic Systems through Neural Network Learning: A Robust Approach
Yu Zhang, Haoyu Zhang, Yongxiang Zou, Houcheng Li, Long Cheng
Investigating Low-Rank Training in Transformer Language Models: Efficiency and Scaling Analysis
Xiuying Wei, Skander Moalla, Razvan Pascanu, Caglar Gulcehre
Team up GBDTs and DNNs: Advancing Efficient and Effective Tabular Prediction with Tree-hybrid MLPs
Jiahuan Yan, Jintai Chen, Qianxing Wang, Danny Z. Chen, Jian Wu
Weight Block Sparsity: Training, Compilation, and AI Engine Accelerators
Paolo D'Alberto, Taehee Jeong, Akshai Jain, Shreyas Manjunath, Mrinal Sarmah, Samuel Hsu, Yaswanth Raparti, Nitesh Pipralia
On Exact Bit-level Reversible Transformers Without Changing Architectures
Guoqiang Zhang, J.P. Lewis, W. B. Kleijn
Evaluating Deep Neural Networks in Deployment (A Comparative and Replicability Study)
Eduard Pinconschi, Divya Gopinath, Rui Abreu, Corina S. Pasareanu
SwishReLU: A Unified Approach to Activation Functions for Enhanced Deep Neural Networks Performance
Jamshaid Ul Rahman, Rubiqa Zulfiqar, Asad Khan, Nimra
Transfer Learning for Wildlife Classification: Evaluating YOLOv8 against DenseNet, ResNet, and VGGNet on a Custom Dataset
Subek Sharma, Sisir Dhakal, Mansi Bhavsar
Using Low-Discrepancy Points for Data Compression in Machine Learning: An Experimental Comparison
Simone Göttlich, Jacob Heieck, Andreas Neuenkirch
Explaining Spectrograms in Machine Learning: A Study on Neural Networks for Speech Classification
Jesin James, Balamurali B. T., Binu Abeysinghe, Junchen Liu
INSIGHT: Universal Neural Simulator for Analog Circuits Harnessing Autoregressive Transformers
Souradip Poddar, Youngmin Oh, Yao Lai, Hanqing Zhu, Bosun Hwang, David Z. Pan