Neural Tangent Kernels

Neural tangent kernels (NTKs) provide a framework for understanding the behavior of neural networks, particularly in the "lazy training" regime, where weights stay close to their initialization and the network behaves approximately like a linear model in its parameters. Current research focuses on leveraging NTKs to analyze feature learning, improve continual learning (e.g., mitigating catastrophic forgetting), and enhance the robustness of models against adversarial attacks. This line of inquiry offers insights into the inner workings of deep learning models, potentially leading to more efficient training algorithms and improved generalization.
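As a concrete illustration of the object being studied, here is a minimal sketch (not from the source) of the *empirical* NTK, defined as the inner product of parameter gradients, Θ(x, x') = ⟨∇_θ f(x), ∇_θ f(x')⟩, for a one-hidden-layer ReLU network. The network, width, and scaling choices are illustrative assumptions:

```python
import numpy as np

# Empirical NTK sketch: Theta(x, x') = <grad_theta f(x), grad_theta f(x')>
# for f(x) = v @ relu(W @ x) / sqrt(m), a toy one-hidden-layer network.
rng = np.random.default_rng(0)
d, m = 3, 512                        # input dim, hidden width (assumed values)
W = rng.standard_normal((m, d))
v = rng.standard_normal(m)

def grads(x):
    """Gradient of the scalar output f(x) w.r.t. all parameters (W, then v)."""
    pre = W @ x                          # pre-activations, shape (m,)
    act = np.maximum(pre, 0.0)           # ReLU activations
    mask = (pre > 0).astype(float)       # ReLU derivative
    dW = np.outer(v * mask, x) / np.sqrt(m)   # df/dW
    dv = act / np.sqrt(m)                     # df/dv
    return np.concatenate([dW.ravel(), dv])

def ntk(x1, x2):
    """Empirical NTK entry: dot product of parameter gradients."""
    return grads(x1) @ grads(x2)

x1, x2 = rng.standard_normal(d), rng.standard_normal(d)
print(ntk(x1, x2))   # scalar kernel value
```

In the lazy regime, training the network with gradient descent is well approximated by kernel regression with this kernel; in the infinite-width limit, Θ stays essentially fixed at its initialization value throughout training.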

Papers