Neural Network
Neural networks are computational models inspired by the structure and function of the brain, aimed at approximating complex functions and solving diverse problems by learning from data. Current research emphasizes efficiency and robustness: novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, as well as methods for understanding and manipulating the representations these networks learn internally, for example through hyper-representations of network weights. These advances are driving progress in computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
Papers
Nesterov acceleration in benignly non-convex landscapes
Kanan Gupta, Stephan Wojtowytsch
Simultaneous Weight and Architecture Optimization for Neural Networks
Zitong Huang, Mansooreh Montazerin, Ajitesh Srivastava
Neural Material Adaptor for Visual Grounding of Intrinsic Dynamics
Junyi Cao, Shanyan Guan, Yanhao Ge, Wei Li, Xiaokang Yang, Chao Ma
A Lightweight Target-Driven Network of Stereo Matching for Inland Waterways
Jing Su, Yiqing Zhou, Yu Zhang, Chao Wang, Yi Wei
Metamizer: a versatile neural optimizer for fast and accurate physics simulations
Nils Wandel, Stefan Schulz, Reinhard Klein
Unifying and Verifying Mechanistic Interpretations: A Case Study with Group Operations
Wilson Wu, Louis Jaburi, Jacob Drori, Jason Gross
Collective variables of neural networks: empirical time evolution and scaling laws
Samuel Tovey, Sven Krippendorf, Michael Spannowsky, Konstantin Nikolaou, Christian Holm
A Generalization Bound for a Family of Implicit Networks
Samy Wu Fung, Benjamin Berkels
DCP: Learning Accelerator Dataflow for Neural Network via Propagation
Peng Xu, Wenqi Shao, Mingyu Ding, Ping Luo
MaD-Scientist: AI-based Scientist solving Convection-Diffusion-Reaction Equations Using Massive PINN-Based Prior Data
Mingu Kang, Dongseok Lee, Woojin Cho, Jaehyeon Park, Kookjin Lee, Anthony Gruber, Youngjoon Hong, Noseong Park
Gaussian-Based and Outside-the-Box Runtime Monitoring Join Forces
Vahid Hashemi, Jan Křetínský, Sabine Rieder, Torsten Schön, Jan Vorhoff
Utilizing Lyapunov Exponents in designing deep neural networks
Tirthankar Mittra
Convolutional neural networks applied to modification of images
Carlos I. Aguirre-Velez, Jose Antonio Arciniega-Nevarez, Eric Dolores-Cuenca
Residual Kolmogorov-Arnold Network for Enhanced Deep Learning
Ray Congrui Yu, Sherry Wu, Jiang Gui
Designing a Classifier for Active Fire Detection from Multispectral Satellite Imagery Using Neural Architecture Search
Amber Cassimon, Phil Reiter, Siegfried Mercelis, Kevin Mets
Hyper-Representations: Learning from Populations of Neural Networks
Konstantin Schürholt
MetaDD: Boosting Dataset Distillation with Neural Network Architecture-Invariant Generalization
Yunlong Zhao, Xiaoheng Deng, Xiu Su, Hongyan Xu, Xiuxing Li, Yijing Liu, Shan You
Function Gradient Approximation with Random Shallow ReLU Networks with Control Applications
Andrew Lamperski, Siddharth Salapaka
Detecting and Approximating Redundant Computational Blocks in Neural Networks
Irene Cannistraci, Emanuele Rodolà, Bastian Rieck