Neural Tangent Kernel
The Neural Tangent Kernel (NTK) framework is an analytical tool for understanding the training dynamics of wide neural networks: in the infinite-width limit, training by gradient descent becomes equivalent to kernel regression with a fixed kernel determined at initialization. Current research applies NTK analysis to a range of architectures and training settings, including variational autoencoders, federated learning, and physics-informed neural networks, and investigates issues such as spectral bias, convergence rates, and generalization performance. This work yields insight into the optimization and generalization behavior of deep learning models and can inform improved training strategies.
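As a concrete illustration of the kernel-method view, the empirical NTK of a network f with parameters θ is Θ(x, x') = ∇_θ f(x) · ∇_θ f(x'). The following sketch (a minimal, assumed setup, not drawn from any of the papers below) computes this Gram matrix analytically for a one-hidden-layer tanh network with the standard 1/√m output scaling:

```python
import numpy as np

# Hypothetical example network: f(x) = a^T tanh(W x) / sqrt(m).
# The empirical NTK is Theta(x, x') = J(x) . J(x'), where J is the
# gradient of the scalar output with respect to all parameters (W, a).

rng = np.random.default_rng(0)
m = 512                      # hidden width; NTK theory studies m -> infinity
d = 3                        # input dimension
W = rng.standard_normal((m, d))
a = rng.standard_normal(m)

def param_jacobian(x):
    """Gradient of f(x) w.r.t. (W, a), flattened into one vector."""
    h = W @ x                        # pre-activations, shape (m,)
    z = np.tanh(h)
    grad_a = z / np.sqrt(m)          # df/da_i = tanh(h_i) / sqrt(m)
    # df/dW_ij = a_i * (1 - tanh(h_i)^2) * x_j / sqrt(m)
    grad_W = ((a * (1 - z**2))[:, None] * x[None, :]) / np.sqrt(m)
    return np.concatenate([grad_W.ravel(), grad_a])

X = rng.standard_normal((4, d))      # four sample inputs
J = np.stack([param_jacobian(x) for x in X])
ntk = J @ J.T                        # empirical NTK Gram matrix, shape (4, 4)

# As an inner-product (Gram) matrix, the NTK is symmetric and PSD.
assert np.allclose(ntk, ntk.T)
assert np.all(np.linalg.eigvalsh(ntk) >= -1e-10)
```

At large width this matrix concentrates around its deterministic infinite-width limit and stays approximately constant during training, which is what licenses the kernel-regression description of the dynamics.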
Papers
Understanding Reconstruction Attacks with the Neural Tangent Kernel and Dataset Distillation
Noel Loo, Ramin Hasani, Mathias Lechner, Alexander Amini, Daniela Rus
Over-parameterised Shallow Neural Networks with Asymmetrical Node Scaling: Global Convergence Guarantees and Feature Learning
Francois Caron, Fadhel Ayed, Paul Jung, Hoil Lee, Juho Lee, Hongseok Yang