Neural Tangent Kernel
The Neural Tangent Kernel (NTK) framework provides a powerful analytical tool for understanding the training dynamics of wide neural networks: in the infinite-width limit, training a network by gradient descent behaves like a kernel method with a fixed kernel. Current research applies the NTK to analyze a range of architectures and training regimes, including variational autoencoders, federated learning, and physics-informed neural networks, and to investigate issues such as spectral bias, convergence rates, and generalization performance. This line of work offers insight into the optimization and generalization properties of deep learning models, potentially leading to improved training strategies and a deeper understanding of their behavior.
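Concretely, the (empirical) NTK between two inputs is the inner product of the network's parameter gradients, K(x, x') = ⟨∇θ f(x; θ), ∇θ f(x'; θ)⟩. The sketch below, a hypothetical illustration rather than code from any paper discussed here, computes this kernel for a one-hidden-layer ReLU network with the 1/√m output scaling used in NTK analyses (width, input dimension, and initialization are arbitrary choices for the example):

```python
import numpy as np

# Empirical NTK for a one-hidden-layer network
#   f(x) = (1/sqrt(m)) * sum_i a_i * relu(w_i . x)
#   K(x, x') = <grad_theta f(x), grad_theta f(x')>
# Width m, input dim d, and the seed are illustrative assumptions.

rng = np.random.default_rng(0)
d, m = 3, 512                        # input dim, hidden width
W = rng.standard_normal((m, d))      # hidden weights, N(0, 1) init
a = rng.standard_normal(m)           # output weights

def grad_theta(x):
    """Gradient of f at input x w.r.t. all parameters (a and W), flattened."""
    pre = W @ x                      # preactivations, shape (m,)
    act = np.maximum(pre, 0.0)       # ReLU activations
    g_a = act / np.sqrt(m)           # df/da_i = relu(w_i . x) / sqrt(m)
    # df/dw_i = a_i * 1[pre_i > 0] * x / sqrt(m)
    g_W = ((a * (pre > 0))[:, None] * x[None, :]) / np.sqrt(m)
    return np.concatenate([g_a, g_W.ravel()])

def ntk(x1, x2):
    """Empirical NTK entry: inner product of parameter gradients."""
    return grad_theta(x1) @ grad_theta(x2)

x1 = rng.standard_normal(d)
x2 = rng.standard_normal(d)
K = np.array([[ntk(x1, x1), ntk(x1, x2)],
              [ntk(x2, x1), ntk(x2, x2)]])
```

As the width m grows, this kernel concentrates around a deterministic limit and stays approximately constant during training, which is what licenses the kernel-method view of wide networks. Note that the Gram matrix K is symmetric and positive semidefinite by construction.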