Neural Networks
Neural networks are computational models, inspired by the structure and function of the brain, that approximate complex functions and solve diverse problems by learning from data. Current research emphasizes efficiency and robustness, exploring novel architectures such as sinusoidal neural fields and hybrid models that combine neural networks with radial basis functions, as well as methods for understanding and manipulating the internal representations these networks learn, for example through hyper-representations of network weights. These advances are driving progress across computer vision, natural language processing, and scientific modeling by enabling more accurate, efficient, and interpretable AI systems.
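As a minimal illustration of the core idea above, the sketch below trains a one-hidden-layer network by gradient descent to approximate sin(x) from sampled data. It is a generic toy example in NumPy, not the method of any paper listed here; the architecture (1-16-1, tanh activation) and hyperparameters are arbitrary choices for demonstration.

```python
import numpy as np

# Toy dataset: noiseless samples of the target function f(x) = sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(-np.pi, np.pi, size=(256, 1))
y = np.sin(X)

# Randomly initialised 1-16-1 network with tanh hidden activation.
W1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)

def forward(X, W1, b1, W2, b2):
    h = np.tanh(X @ W1 + b1)        # hidden activations
    return h, h @ W2 + b2           # network prediction

_, pred0 = forward(X, W1, b1, W2, b2)
loss0 = np.mean((pred0 - y) ** 2)   # mean-squared error before training

lr = 0.05
for _ in range(2000):
    h, pred = forward(X, W1, b1, W2, b2)
    err = 2.0 * (pred - y) / len(X)      # d(MSE)/d(pred)
    gW2 = h.T @ err; gb2 = err.sum(0)    # gradients for output layer
    dh = (err @ W2.T) * (1.0 - h ** 2)   # backprop through tanh
    gW1 = X.T @ dh;  gb1 = dh.sum(0)     # gradients for hidden layer
    W1 -= lr * gW1; b1 -= lr * gb1       # gradient-descent update
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X, W1, b1, W2, b2)
loss = np.mean((pred - y) ** 2)
print(f"MSE before training: {loss0:.4f}, after: {loss:.4f}")
```

Training drives the mean-squared error well below its initial value, showing the function-approximation behaviour the summary describes in its simplest form.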
Papers
Stochastic stem bucking using mixture density neural networks
Simon Schmiedel
FOXANN: A Method for Boosting Neural Network Performance
Mahmood A. Jumaah, Yossra H. Ali, Tarik A. Rashid, S. Vimal
KHNNs: hypercomplex neural networks computations via Keras using TensorFlow and PyTorch
Agnieszka Niemczynowicz, Radosław Antoni Kycia
Fully tensorial approach to hypercomplex neural networks
Agnieszka Niemczynowicz, Radosław Antoni Kycia
Axiomatization of Gradient Smoothing in Neural Networks
Linjiang Zhou, Xiaochuan Shi, Chao Ma, Zepeng Wang
TabSketchFM: Sketch-based Tabular Representation Learning for Data Discovery over Data Lakes
Aamod Khatiwada, Harsha Kokel, Ibrahim Abdelaziz, Subhajit Chaudhury, Julian Dolby, Oktie Hassanzadeh, Zhenhan Huang, Tejaswini Pedapati, Horst Samulowitz, Kavitha Srinivas
RepAct: The Re-parameterizable Adaptive Activation Function
Xian Wu, Qingchuan Tao, Shuang Wang
DPEC: Dual-Path Error Compensation Method for Enhanced Low-Light Image Clarity
Shuang Wang, Qianwen Lu, Yihe Nie, Qingchuan Tao, Yanmei Yu
MCNC: Manifold Constrained Network Compression
Chayne Thrash, Ali Abbasi, Parsa Nooralinejad, Soroush Abbasi Koohpayegani, Reed Andreas, Hamed Pirsiavash, Soheil Kolouri
Advancing operational PM2.5 forecasting with dual deep neural networks (D-DNet)
Shengjuan Cai, Fangxin Fang, Vincent-Henri Peuch, Mihai Alexe, Ionel Michael Navon, Yanghua Wang
Super-resolution imaging using super-oscillatory diffractive neural networks
Hang Chen, Sheng Gao, Zejia Zhao, Zhengyang Duan, Haiou Zhang, Gordon Wetzstein, Xing Lin
Dimensions underlying the representational alignment of deep neural networks with humans
Florian P. Mahner, Lukas Muttenthaler, Umut Güçlü, Martin N. Hebart
Semi-adaptive Synergetic Two-way Pseudoinverse Learning System
Binghong Liu, Ziqi Zhao, Shupan Li, Ke Wang
On Reducing Activity with Distillation and Regularization for Energy Efficient Spiking Neural Networks
Thomas Louis, Benoit Miramond, Alain Pegatoquet, Adrien Girard
Quantum-tunnelling deep neural networks for sociophysical neuromorphic AI
Ivan S. Maksymov
Multimodal Reaching-Position Prediction for ADL Support Using Neural Networks
Yutaka Takase, Kimitoshi Yamazaki
Learning Neural Networks with Sparse Activations
Pranjal Awasthi, Nishanth Dikkala, Pritish Kamath, Raghu Meka
Why Line Search when you can Plane Search? SO-Friendly Neural Networks allow Per-Iteration Optimization of Learning and Momentum Rates for Every Layer
Betty Shea, Mark Schmidt
Embedded event based object detection with spiking neural network
Jonathan Courtois, Pierre-Emmanuel Novac, Edgar Lemaire, Alain Pegatoquet, Benoit Miramond
Early learning of the optimal constant solution in neural networks and humans
Jirko Rubruck, Jan P. Bauer, Andrew Saxe, Christopher Summerfield