Kernel Banach Space
Reproducing Kernel Banach Spaces (RKBSs) offer a powerful mathematical framework for analyzing neural networks. They move beyond traditional Reproducing Kernel Hilbert Spaces (RKHSs), which cannot capture the sparsity-promoting, non-Hilbertian norms that arise in practical network architectures. Current research focuses on applying RKBS theory to deep networks, including graph convolutional networks, and on establishing representer theorems that justify the use of finite-width architectures. These theoretical advances yield a deeper understanding of neural network function spaces, with implications for generalization, optimization, and network compression.
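To make the framework concrete, here is a minimal sketch of the standard definition and the kind of representer theorem the paragraph refers to; the exact form of the theorem varies across the cited literature, so this should be read as illustrative rather than definitive:

```latex
% A Banach space \mathcal{B} of functions on a set X is a
% reproducing kernel Banach space if every point evaluation
% is a continuous linear functional:
\[
  \forall x \in X:\quad f \mapsto f(x)
  \ \text{is continuous on } \mathcal{B},
  \qquad |f(x)| \le C_x \, \|f\|_{\mathcal{B}}.
\]
% Via the duality pairing with \mathcal{B}^*, this yields a
% reproducing kernel K with
\[
  f(x) = \langle f,\, K(\cdot, x) \rangle_{\mathcal{B} \times \mathcal{B}^*}.
\]
% A representer theorem then states that a regularized
% empirical risk minimizer over data (x_1,\dots,x_n)
\[
  f^\star \in \arg\min_{f \in \mathcal{B}}
  \ \sum_{i=1}^{n} L\bigl(f(x_i), y_i\bigr)
  + \lambda \, \|f\|_{\mathcal{B}}
\]
% admits a finite representation driven by the n data points --
% the RKBS analogue of the finite-width architectures mentioned above.
```

Unlike the Hilbert case, the RKBS norm need not come from an inner product (e.g. it may be an $\ell^1$-type norm), which is precisely what allows these spaces to model sparsity in network weights.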