Reproducing Kernel Banach Spaces

Reproducing Kernel Banach Spaces (RKBSs) offer a powerful mathematical framework for analyzing neural networks, moving beyond traditional Reproducing Kernel Hilbert Spaces (RKHSs): because an RKBS need not carry an inner product, it can accommodate the sparsity and non-Hilbertian geometry of practical network architectures. Current research focuses on extending RKBS theory to deep networks, including graph convolutional networks, and on establishing representer theorems that justify restricting attention to finite-width architectures. These theoretical advances deepen the understanding of neural network function spaces, yielding insights into generalization, optimization, and network compression.
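As a rough illustration of the objects involved (standard textbook definitions, not the construction of any particular paper below), an RKBS is a Banach space of functions with continuous point evaluations, and a representer theorem then guarantees that regularized learning admits finite-width solutions:

```latex
% A Banach space \mathcal{B} of functions f : X \to \mathbb{R} is an RKBS
% if every point evaluation is a bounded linear functional:
\[
  |f(x)| \le C_x \, \|f\|_{\mathcal{B}}
  \quad \text{for all } f \in \mathcal{B} \text{ and some } C_x > 0.
\]
% Unlike an RKHS, \mathcal{B} carries no inner product, so the single
% reproducing kernel is replaced by a pairing between \mathcal{B} and its
% dual. A representer theorem for regularized empirical risk,
\[
  \min_{f \in \mathcal{B}} \;
  \sum_{i=1}^{n} \ell\bigl(f(x_i), y_i\bigr) + \lambda \|f\|_{\mathcal{B}},
\]
% then asserts (under suitable conditions on \mathcal{B} and \ell) that
% some minimizer is a finite combination of "atoms" (e.g. neurons),
% which is what justifies working with finite-width architectures.
```

The exact form of the atoms and the conditions on the norm vary across the papers in this collection; the display above is only meant to fix notation for the general pattern.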

Papers