Neural Tangent Kernels
Neural tangent kernels (NTKs) provide a framework for understanding the behavior of neural networks, particularly in the "lazy training" regime where weights change minimally. Current research focuses on leveraging NTKs to analyze feature learning, improve continual learning (e.g., mitigating catastrophic forgetting), and enhance the robustness of models against adversarial attacks. This line of inquiry offers valuable insights into the inner workings of deep learning models, potentially leading to more efficient training algorithms and improved model generalization.
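The empirical NTK of a finite network is the Gram matrix of per-example parameter gradients, K(x, x') = ⟨∇θ f(x), ∇θ f(x')⟩; in the lazy regime this kernel stays nearly constant during training. A minimal NumPy sketch for a tiny two-layer network (the architecture and 1/√h scaling here are illustrative assumptions, not tied to any specific paper on this page):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer scalar network f(x) = w2 . tanh(W1 x) / sqrt(h)  (NTK scaling)
d, h = 3, 64                      # input dim, hidden width
W1 = rng.standard_normal((h, d))  # first-layer weights
w2 = rng.standard_normal(h)       # second-layer weights

def param_grad(x):
    """Gradient of f(x) with respect to all parameters, flattened."""
    z = W1 @ x                    # pre-activation, shape (h,)
    a = np.tanh(z)                # hidden activation
    g_w2 = a / np.sqrt(h)                                   # df/dw2
    g_W1 = np.outer(w2 * (1.0 - a**2), x) / np.sqrt(h)      # df/dW1
    return np.concatenate([g_W1.ravel(), g_w2])

# Empirical NTK Gram matrix on a small batch:
# K[i, j] = <grad f(x_i), grad f(x_j)>
X = rng.standard_normal((5, d))
G = np.stack([param_grad(x) for x in X])
K = G @ G.T

print(np.allclose(K, K.T))                      # symmetric
print(np.all(np.linalg.eigvalsh(K) >= -1e-8))   # positive semidefinite
```

As a Gram matrix of gradients, K is symmetric positive semidefinite by construction; tracking how much K drifts between initialization and the end of training is one common empirical test for whether a network is in the lazy regime.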