Neural Tangent Kernel
The Neural Tangent Kernel (NTK) framework provides a powerful analytical tool for understanding the training dynamics of wide neural networks by treating them as kernel methods in the infinite-width limit. Current research applies NTK analysis to a range of architectures and training settings, including variational autoencoders, federated learning, and physics-informed neural networks, and investigates issues such as spectral bias, convergence rates, and generalization. This line of work offers insight into the optimization and generalization properties of deep learning models, potentially leading to improved training strategies and a deeper understanding of model behavior.
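Concretely, for a network f(x; θ), the (empirical) NTK is Θ(x, x') = ⟨∇_θ f(x; θ), ∇_θ f(x'; θ)⟩; in the infinite-width limit this kernel becomes deterministic and stays essentially constant during training, so gradient descent behaves like kernel regression with Θ. The sketch below computes the empirical NTK of a small MLP in JAX under illustrative assumptions: the architecture (a tanh MLP), widths, and the helper names (init_params, mlp, empirical_ntk) are made up for this example and are not from any particular paper.

```python
import jax
import jax.numpy as jnp

def init_params(key, widths=(1, 256, 256, 1)):
    # Random weights scaled by 1/sqrt(fan-in); an illustrative choice.
    keys = jax.random.split(key, len(widths) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, m, n in zip(keys, widths[:-1], widths[1:])]

def mlp(params, x):
    # Small tanh MLP with scalar output per input.
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return (x @ W + b).squeeze(-1)

def empirical_ntk(params, x1, x2):
    # Theta(x, x') = <df(x)/dtheta, df(x')/dtheta>, summed over all parameters.
    j1 = jax.jacobian(mlp)(params, x1)  # pytree of Jacobians, leading axis = batch
    j2 = jax.jacobian(mlp)(params, x2)
    flat1 = jnp.concatenate(
        [j.reshape(j.shape[0], -1) for j in jax.tree_util.tree_leaves(j1)], axis=1)
    flat2 = jnp.concatenate(
        [j.reshape(j.shape[0], -1) for j in jax.tree_util.tree_leaves(j2)], axis=1)
    return flat1 @ flat2.T  # (batch1, batch2) kernel matrix

key = jax.random.PRNGKey(0)
params = init_params(key)
x = jnp.linspace(-1.0, 1.0, 8)[:, None]
K = empirical_ntk(params, x, x)
print(K.shape)  # (8, 8)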