Kolmogorov-Arnold Network
Kolmogorov-Arnold Networks (KANs) are a neural network architecture that places learnable activation functions on edges rather than on nodes, offering a potential alternative to traditional Multi-Layer Perceptrons (MLPs). Current research focuses on improving KAN efficiency and accuracy through variants such as SincKANs and Equivariant KANs (EKANs), and explores applications in areas including image processing, function approximation, and the solution of partial differential equations. The appeal of KANs lies in their potential for improved interpretability and task-specific performance, although comparisons with MLPs show mixed results that depend on the application and dataset.
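To make the edge-versus-node distinction concrete, the sketch below shows a minimal KAN-style layer in PyTorch, in which every edge from input i to output j carries its own learnable univariate function. This is illustrative only: the names KANLayer, num_basis, and grid_range are invented here, and each edge function is parameterized as a linear combination of fixed Gaussian radial basis functions, a simplification of the B-spline parameterization used in the original KAN paper.

```python
import torch
import torch.nn as nn

class KANLayer(nn.Module):
    """One KAN-style layer: a learnable univariate function on every
    edge (input i -> output j), instead of a fixed activation on each
    node. Each edge function is a learned combination of Gaussian
    radial basis functions (a stand-in for B-splines)."""

    def __init__(self, in_dim, out_dim, num_basis=8, grid_range=(-2.0, 2.0)):
        super().__init__()
        # Fixed basis centers spread over the expected input range.
        centers = torch.linspace(grid_range[0], grid_range[1], num_basis)
        self.register_buffer("centers", centers)
        self.gamma = num_basis / (grid_range[1] - grid_range[0])
        # One coefficient vector per edge: (out_dim, in_dim, num_basis).
        self.coef = nn.Parameter(torch.randn(out_dim, in_dim, num_basis) * 0.1)

    def forward(self, x):
        # x: (batch, in_dim). Evaluate all basis functions at each input,
        # giving rbf of shape (batch, in_dim, num_basis).
        rbf = torch.exp(-(self.gamma * (x.unsqueeze(-1) - self.centers)) ** 2)
        # Output j sums its learned edge functions phi_{j,i}(x_i).
        return torch.einsum("bik,oik->bo", rbf, self.coef)

# Toy usage: fit y = sin(x0) + x1^2 with a two-layer KAN.
model = nn.Sequential(KANLayer(2, 4), KANLayer(4, 1))
x = torch.rand(256, 2) * 2 - 1
y = torch.sin(x[:, :1]) + x[:, 1:] ** 2
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(200):
    opt.zero_grad()
    loss = ((model(x) - y) ** 2).mean()
    loss.backward()
    opt.step()
```

In an MLP the weights on the edges are scalars and the nonlinearity is fixed at each node; here the entire nonlinearity lives on the edges and is itself trained, which is what underlies the interpretability claims made for KANs.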
Papers
Evaluating Federated Kolmogorov-Arnold Networks on Non-IID Data
Arthur Mendonça Sasse, Claudio Miceli de Farias
Kolmogorov-Arnold Neural Networks for High-Entropy Alloys Design
Yagnik Bandyopadhyay, Harshil Avlani, Houlong L. Zhuang
The Proof of Kolmogorov-Arnold May Illuminate Neural Network Learning
Michael H. Freedman
Baseflow identification via explainable AI with Kolmogorov-Arnold networks
Chuyang Liu, Tirthankar Roy, Daniel M. Tartakovsky, Dipankar Dwivedi
On the Convergence of (Stochastic) Gradient Descent for Kolmogorov-Arnold Networks
Yihang Gao, Vincent Y. F. Tan
Generalization Bounds and Model Complexity for Kolmogorov-Arnold Networks
Xianyang Zhang, Huijuan Zhou
Kolmogorov-Arnold Network Autoencoders
Mohammadamin Moradi, Shirin Panahi, Erik Bollt, Ying-Cheng Lai
Model Comparisons: XNet Outperforms KAN
Xin Li, Zhihong Jeff Xia, Xiaotao Zheng
Deep Learning Alternatives of the Kolmogorov Superposition Theorem
Leonardo Ferreira Guilhoto, Paris Perdikaris
On the expressiveness and spectral bias of KANs
Yixuan Wang, Jonathan W. Siegel, Ziming Liu, Thomas Y. Hou
Uncertainty Quantification with Bayesian Higher Order ReLU KANs
James Giroux, Cristiano Fanelli