Neural Network
Neural networks are computational models inspired by the structure and function of the brain, designed to approximate complex functions and solve diverse problems by learning from data. Current research emphasizes efficiency and robustness: exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, and developing methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances are driving progress in computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
Papers - Page 30
Investigating the Gestalt Principle of Closure in Deep Convolutional Neural Networks
Yuyan Zhang, Derya Soydaner, Fatemeh Behrad, Lisa Koßmann, Johan Wagemans
Differentiable architecture search with multi-dimensional attention for spiking neural networks
Yilei Man, Linhai Xie, Shushan Qiao, Yumei Zhou, Delong Shang
Advantages of Neural Population Coding for Deep Learning
Heiko Hoffmann
How many classifiers do we need?
Hyunsuk Kim, Liam Hodgkinson, Ryan Theisen, Michael W. Mahoney
Deep Learning Through A Telescoping Lens: A Simple Model Provides Empirical Insights On Grokking, Gradient Boosting & Beyond
Alan Jeffares, Alicia Curth, Mihaela van der Schaar
Protecting Feed-Forward Networks from Adversarial Attacks Using Predictive Coding
Ehsan Ganjidoost, Jeff Orchard
Learning local discrete features in explainable-by-design convolutional neural networks
Pantelis I. Kaplanoglou, Konstantinos Diamantaras
Understanding Generalizability of Diffusion Models Requires Rethinking the Hidden Gaussian Structure
Xiang Li, Yixiang Dai, Qing Qu
Clustering Head: A Visual Case Study of the Training Dynamics in Transformers
Ambroise Odonnat, Wassim Bouaziz, Vivien Cabannes
Neural Network Verification with PyRAT
Augustin Lemesle, Julien Lehmann, Tristan Le Gall
DynaSplit: A Hardware-Software Co-Design Framework for Energy-Aware Inference on Edge
Daniel May, Alessandro Tundo, Shashikant Ilager, Ivona Brandic
Noise as a Double-Edged Sword: Reinforcement Learning Exploits Randomized Defenses in Neural Networks
Steve Bakos, Pooria Madani, Heidar Davoudi
Reducing Oversmoothing through Informed Weight Initialization in Graph Neural Networks
Dimitrios Kelesis, Dimitris Fotakis, Georgios Paliouras
Syno: Structured Synthesis for Neural Operators
Yongqi Zhuo, Zhengyuan Su, Chenggang Zhao, Mingyu Gao
Decoding Fatigue Levels of Pilots Using EEG Signals with Hybrid Deep Neural Networks
Dae-Hyeok Lee, Sung-Jin Kim, Si-Hyun Kim
DASH: Warm-Starting Neural Network Training in Stationary Settings without Loss of Plasticity
Baekrok Shin, Junsoo Oh, Hanseul Cho, Chulhee Yun
Ensemble learning of the atrial fiber orientation with physics-informed neural networks
Efraín Magaña, Simone Pezzuto, Francisco Sahli Costabal
Non-binary artificial neuron with phase variation implemented on a quantum computer
Jhordan Silveira de Borba, Jonas Maziero
Tightening convex relaxations of trained neural networks: a unified approach for convex and S-shaped activations
Pablo Carrasco, Gonzalo Muñoz