Neural Network
Neural networks are computational models inspired by the structure and function of the brain. They approximate complex functions and solve diverse problems by learning from data. Current research emphasizes efficiency and robustness: exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, and developing methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances are driving progress in computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
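To make the core idea concrete, here is a minimal sketch (not drawn from any of the papers below) of a neural network approximating a function from data: a two-layer network with tanh hidden units, trained by plain gradient descent on mean squared error to fit sin(x). All sizes and hyperparameters here are illustrative assumptions.

```python
import numpy as np

# Training data: samples of the target function f(x) = sin(x) on [-pi, pi].
rng = np.random.default_rng(0)
X = rng.uniform(-np.pi, np.pi, size=(256, 1))
y = np.sin(X)

# Assumed architecture: 32 tanh hidden units, linear output.
W1 = rng.normal(0.0, 0.5, size=(1, 32))
b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, size=(32, 1))
b2 = np.zeros(1)

lr = 0.05
for _ in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    # Backward pass: gradients of mean squared error
    g_pred = 2.0 * err / len(X)
    gW2 = h.T @ g_pred
    gb2 = g_pred.sum(axis=0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)  # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)
    # Gradient descent step
    W1 -= lr * gW1
    b1 -= lr * gb1
    W2 -= lr * gW2
    b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final MSE: {mse:.4f}")
```

After training, the network's mean squared error on the sampled points is small, illustrating the "approximating complex functions through learning from data" framing; the papers listed below build on this basic mechanism with more sophisticated architectures and training dynamics.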
Papers
WaterMAS: Sharpness-Aware Maximization for Neural Network Watermarking
Carl De Sousa Trias, Mihai Mitrea, Attilio Fiandrotti, Marco Cagnazzo, Sumanta Chaudhuri, Enzo Tartaglione
Memory-Optimized Once-For-All Network
Maxime Girard, Victor Quétu, Samuel Tardieu, Van-Tam Nguyen, Enzo Tartaglione
Dynamics of Supervised and Reinforcement Learning in the Non-Linear Perceptron
Christian Schmid, James M. Murray
Reducing Bias in Deep Learning Optimization: The RSGDM Approach
Honglin Qin, Hongye Zheng, Bingxing Wang, Zhizhong Wu, Bingyao Liu, Yuanfang Yang
Shuffle Vision Transformer: Lightweight, Fast and Efficient Recognition of Driver Facial Expression
Ibtissam Saadi, Douglas W. Cunningham, Abdelmalik Taleb-Ahmed, Abdenour Hadid, Yassin El Hillali
Weight Conditioning for Smooth Optimization of Neural Networks
Hemanth Saratchandran, Thomas X. Wang, Simon Lucey
Hyperbolic Brain Representations
Alexander Joseph, Nathan Francis, Meijke Balay
Boundless: Generating Photorealistic Synthetic Data for Object Detection in Urban Streetscapes
Mehmet Kerem Turkcan, Yuyang Li, Chengbo Zang, Javad Ghaderi, Gil Zussman, Zoran Kostic
SNNAX -- Spiking Neural Networks in JAX
Jamie Lohoff, Jan Finkbeiner, Emre Neftci
Neural Networks with LSTM and GRU in Modeling Active Fires in the Amazon
Ramon Tavares, Ricardo Olinda
K-Origins: Better Colour Quantification for Neural Networks
Lewis Mason, Mark Martinez
GradINN: Gradient Informed Neural Network
Filippo Aglietti, Francesco Della Santa, Andrea Piano, Virginia Aglietti
Decoding finger velocity from cortical spike trains with recurrent spiking neural networks
Tengjun Liu, Julia Gygax, Julian Rossbroich, Yansong Chua, Shaomin Zhang, Friedemann Zenke
Frequency-Spatial Entanglement Learning for Camouflaged Object Detection
Yanguang Sun, Chunyan Xu, Jian Yang, Hanyu Xuan, Lei Luo
AQ-PINNs: Attention-Enhanced Quantum Physics-Informed Neural Networks for Carbon-Efficient Climate Modeling
Siddhant Dutta, Nouhaila Innan, Sadok Ben Yahia, Muhammad Shafique
DAPONet: A Dual Attention and Partially Overparameterized Network for Real-Time Road Damage Detection
Weichao Pan, Jiaju Kang, Xu Wang, Zhihao Chen, Yiyuan Ge
Quantifying Emergence in Neural Networks: Insights from Pruning and Training Dynamics
Faisal AlShinaifi, Zeyad Almoaigel, Johnny Jingze Li, Abdulla Kuleib, Gabriel A. Silva
Hybridization of Persistent Homology with Neural Networks for Time-Series Prediction: A Case Study in Wave Height
Zixin Lin, Nur Fariha Syaqina Zulkepli, Mohd Shareduwan Mohd Kasihmuddin, R. U. Gobithaasan