Neural Network
Neural networks are computational models, loosely inspired by the structure and function of the brain, that learn to approximate complex functions from data and thereby solve a wide range of problems. Current research emphasizes improving efficiency and robustness: exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, and developing methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances enable more accurate, efficient, and interpretable AI systems, driving progress in computer vision, natural language processing, and scientific modeling.
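To make one of these directions concrete, below is a minimal sketch of a sinusoidal neural field in the SIREN style: a coordinate-based network whose hidden layers apply sine activations, letting it represent high-frequency signals that plain ReLU networks fit poorly. The class names (SineLayer, SinusoidalField), the frequency scale omega_0, and the toy fitting loop are illustrative assumptions for this digest, not code from any paper listed here.

# Minimal sinusoidal neural field sketch (SIREN-style), assuming PyTorch.
import math
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """Linear layer followed by a sine activation: sin(omega_0 * (Wx + b))."""
    def __init__(self, in_features, out_features, omega_0=30.0):
        super().__init__()
        self.omega_0 = omega_0  # frequency scale; 30.0 follows the SIREN recipe
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))

class SinusoidalField(nn.Module):
    """Maps input coordinates (e.g., 1D or 2D positions) to signal values."""
    def __init__(self, in_features=2, hidden=64, out_features=1, depth=3):
        super().__init__()
        blocks = [SineLayer(in_features, hidden)]
        for _ in range(depth - 1):
            blocks.append(SineLayer(hidden, hidden))
        blocks.append(nn.Linear(hidden, out_features))  # linear output head
        self.net = nn.Sequential(*blocks)

    def forward(self, coords):
        return self.net(coords)

# Toy usage: fit the field to a high-frequency 1D signal on [-1, 1].
if __name__ == "__main__":
    coords = torch.linspace(-1, 1, 256).unsqueeze(-1)
    target = torch.sin(8 * math.pi * coords)  # signal the field memorizes
    model = SinusoidalField(in_features=1)
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    for step in range(200):
        loss = ((model(coords) - target) ** 2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

The sine activations act as a learned Fourier-like basis, which is why such fields fit fine detail with a small network; the low learning rate keeps training stable given the large omega_0.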
Papers
Loss Landscape Characterization of Neural Networks without Over-Parametrization
Rustem Islamov, Niccolò Ajroldi, Antonio Orvieto, Aurelien Lucchi
Stable Diffusion with Continuous-time Neural Network
Andras Horvath
LPUF-AuthNet: A Lightweight PUF-Based IoT Authentication via Tandem Neural Networks and Split Learning
Brahim Mefgouda, Raviha Khan, Omar Alhussein, Hani Saleh, Hossien B. Eldeeb, Anshul Pandey, Sami Muhaidat
DiffGAN: A Test Generation Approach for Differential Testing of Deep Neural Networks
Zohreh Aghababaeyan, Manel Abdellatif, Lionel Briand, Ramesh S
G-Designer: Architecting Multi-agent Communication Topologies via Graph Neural Networks
Guibin Zhang, Yanwei Yue, Xiangguo Sun, Guancheng Wan, Miao Yu, Junfeng Fang, Kun Wang, Dawei Cheng
Degradation Oriented and Regularized Network for Real-World Depth Super-Resolution
Zhengxue Wang, Zhiqiang Yan
Are High-Degree Representations Really Unnecessary in Equivariant Graph Neural Networks?
Jiacheng Cen, Anyi Li, Ning Lin, Yuxiang Ren, Zihe Wang, Wenbing Huang
Error Diffusion: Post Training Quantization with Block-Scaled Number Formats for Neural Networks
Alireza Khodamoradi, Kristof Denolf, Eric Dellinger
Statistical Properties of Deep Neural Networks with Dependent Data
Chad Brown
Towards a More Complete Theory of Function Preserving Transforms
Michael Painter
Hard-Constrained Neural Networks with Universal Approximation Guarantees
Youngjae Min, Anoopkumar Sonar, Navid Azizan
Dynamical loss functions shape landscape topography and improve learning in artificial neural networks
Eduardo Lavin, Miguel Ruiz-Garcia
Neural networks that overcome classic challenges through practice
Kazuki Irie, Brenden M. Lake
Feature Averaging: An Implicit Bias of Gradient Descent Leading to Non-Robustness in Neural Networks
Binghui Li, Zhixuan Pan, Kaifeng Lyu, Jian Li
Fast and Accurate Neural Rendering Using Semi-Gradients
In-Young Cho, Jaewoong Cho
DuoDiff: Accelerating Diffusion Models with a Dual-Backbone Approach
Daniel Gallo Fernández, Rǎzvan-Andrei Matişan, Alejandro Monroy Muñoz, Ana-Maria Vasilcoiu, Janusz Partyka, Tin Hadži Veljković, Metod Jazbec
Structure of Artificial Neural Networks -- Empirical Investigations
Julian Stier
PrivQuant: Communication-Efficient Private Inference with Quantized Network/Protocol Co-Optimization
Tianshi Xu, Shuzhang Zhong, Wenxuan Zeng, Runsheng Wang, Meng Li