KAN Layer
Kolmogorov-Arnold Networks (KANs) are a neural network architecture proposed as an alternative to multilayer perceptrons (MLPs): instead of fixed activations on nodes and learnable weights on edges, KANs place learnable univariate functions (typically parameterized as splines) on the edges, which can improve accuracy, interpretability, and parameter efficiency on some tasks. Current research compares KANs to MLPs across diverse applications, including scientific modeling (e.g., differential equation solving and physics simulations), image processing, and time series forecasting, and often integrates KAN layers into existing architectures such as U-Nets. The reported gains in performance and interpretability, particularly for complex non-linear relationships and symbolic representations, show promise for scientific discovery and practical applications across multiple fields.
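To make the edge-function idea concrete, below is a minimal, pure-Python sketch of a single KAN layer's forward pass. It is not the reference implementation from the papers above: Gaussian radial basis functions stand in for the B-spline bases used in the original KAN work, and all names (`KANLayer`, `rbf_basis`, `num_basis`) and initialization choices are illustrative assumptions.

```python
import math
import random

def rbf_basis(x, centers, width):
    """Gaussian radial basis values at x (a stand-in for a B-spline basis)."""
    return [math.exp(-((x - c) / width) ** 2) for c in centers]

class KANLayer:
    """Sketch of a KAN layer: each edge (i, j) carries a learnable
    univariate function phi_ij, expressed as a linear combination of
    fixed basis functions; output_j = sum_i phi_ij(x_i)."""

    def __init__(self, in_dim, out_dim, num_basis=5,
                 x_min=-1.0, x_max=1.0, seed=0):
        rng = random.Random(seed)
        self.in_dim, self.out_dim = in_dim, out_dim
        step = (x_max - x_min) / (num_basis - 1)
        self.centers = [x_min + k * step for k in range(num_basis)]
        self.width = step
        # coeffs[j][i][k]: weight of basis function k on edge i -> j
        # (these are the learnable parameters; randomly initialized here)
        self.coeffs = [[[rng.uniform(-0.5, 0.5) for _ in range(num_basis)]
                        for _ in range(in_dim)] for _ in range(out_dim)]

    def forward(self, x):
        out = []
        for j in range(self.out_dim):
            total = 0.0
            for i in range(self.in_dim):
                basis = rbf_basis(x[i], self.centers, self.width)
                total += sum(c * b for c, b in zip(self.coeffs[j][i], basis))
            out.append(total)
        return out

layer = KANLayer(in_dim=2, out_dim=3)
y = layer.forward([0.2, -0.4])
print(len(y))  # one output per output unit -> 3
```

The key contrast with an MLP layer is that the non-linearity lives on each edge as its own trainable function, rather than as a single fixed activation applied after a linear map; this is what makes individual edge functions inspectable and, in the papers above, amenable to symbolic interpretation.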
Papers
U-KAN Makes Strong Backbone for Medical Image Segmentation and Generation
Chenxin Li, Xinyu Liu, Wuyang Li, Cheng Wang, Hengyu Liu, Yixuan Yuan
A comprehensive and FAIR comparison between MLP and KAN representations for differential equations and operator networks
Khemraj Shukla, Juan Diego Toscano, Zhicheng Wang, Zongren Zou, George Em Karniadakis