Neural Tangent Kernel
The Neural Tangent Kernel (NTK) framework is an analytical tool for understanding the training dynamics of wide neural networks: in the infinite-width limit, gradient-descent training of such a network behaves like a kernel method with a fixed kernel. Current research applies the NTK to a range of architectures and training settings, including variational autoencoders, federated learning, and physics-informed neural networks, to study questions such as spectral bias, convergence rates, and generalization. This line of work yields insight into the optimization and generalization properties of deep learning models and can inform improved training strategies.
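To make the kernel view concrete, here is a minimal sketch of the empirical (finite-width) NTK: the Gram matrix of parameter gradients, K(x, x') = ⟨∂f(x)/∂θ, ∂f(x')/∂θ⟩. The two-layer network, the widths, and the data below are illustrative assumptions for this sketch, not drawn from any particular paper.

```python
import jax
import jax.numpy as jnp

def init_params(key, width=512, d_in=4):
    # Hypothetical toy setup: a two-layer network in NTK parameterization,
    # with weights drawn from N(0, 1) and 1/sqrt(fan_in) scaling applied
    # in the forward pass.
    k1, k2 = jax.random.split(key)
    w1 = jax.random.normal(k1, (d_in, width))
    w2 = jax.random.normal(k2, (width, 1))
    return {"w1": w1, "w2": w2}

def f(params, x):
    # Two-layer ReLU network with a scalar output.
    d_in, width = params["w1"].shape
    h = jax.nn.relu(x @ params["w1"] / jnp.sqrt(d_in))
    return (h @ params["w2"] / jnp.sqrt(width)).squeeze(-1)

def empirical_ntk(params, x1, x2):
    # K[i, j] = <df(x1_i)/dtheta, df(x2_j)/dtheta>, summed over all parameters.
    j1 = jax.jacobian(f)(params, x1)  # pytree of per-parameter Jacobians
    j2 = jax.jacobian(f)(params, x2)
    def contract(a, b):
        # Contract away all parameter axes, keeping the two batch axes.
        return jnp.tensordot(a, b, axes=(list(range(1, a.ndim)),
                                         list(range(1, b.ndim))))
    return sum(jax.tree_util.tree_leaves(
        jax.tree_util.tree_map(contract, j1, j2)))

key = jax.random.PRNGKey(0)
params = init_params(key)
x = jax.random.normal(key, (8, 4))
K = empirical_ntk(params, x, x)
print(K.shape)  # (8, 8) Gram matrix
```

At large width this Gram matrix concentrates around a deterministic kernel and changes little over the course of training, which is what licenses analyzing gradient descent on the network as kernel regression with the NTK.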