Empirical Neural Tangent Kernel

The empirical neural tangent kernel (eNTK) is the kernel induced by a finite network's parameter gradients at its current weights; it approximates the network's training dynamics under gradient descent, offering a valuable tool for understanding generalization and implicit regularization. Current research focuses on making eNTK approximations more efficient and accurate, exploring its relationship to phenomena like neural collapse, and leveraging it for tasks such as data selection and federated learning. This work is significant because it bridges the theory-practice gap in deep learning, providing insights into model behavior and enabling more robust and efficient training methods. The eNTK's applications in explainable AI and improved federated learning algorithms highlight its practical impact.
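To make the definition concrete, here is a minimal sketch of computing an eNTK Gram matrix for a toy one-hidden-layer scalar-output network, f(x) = w2 · tanh(W1 x). The entry K[i, j] is the inner product of the per-example parameter gradients, ⟨∂f(x_i)/∂θ, ∂f(x_j)/∂θ⟩, evaluated at the current weights. The architecture, shapes, and helper names (`param_gradient`, `entk`) are illustrative assumptions, not a method from any particular paper above.

```python
import numpy as np

# Toy network: f(x) = w2 . tanh(W1 x), with hand-derived parameter gradients.
# Sizes are arbitrary and chosen only for illustration.
rng = np.random.default_rng(0)
d_in, d_hidden = 4, 16
W1 = rng.normal(size=(d_hidden, d_in)) / np.sqrt(d_in)
w2 = rng.normal(size=d_hidden) / np.sqrt(d_hidden)

def param_gradient(x):
    """Flattened gradient of f(x) with respect to all parameters (W1, w2)."""
    h = np.tanh(W1 @ x)
    grad_w2 = h                               # df/dw2 = tanh(W1 x)
    grad_W1 = np.outer(w2 * (1.0 - h**2), x)  # df/dW1 by the chain rule
    return np.concatenate([grad_W1.ravel(), grad_w2])

def entk(X):
    """eNTK Gram matrix: K[i, j] = <df/dtheta(x_i), df/dtheta(x_j)>."""
    J = np.stack([param_gradient(x) for x in X])  # (n, n_params) Jacobian
    return J @ J.T

X = rng.normal(size=(8, d_in))
K = entk(X)
# As a Gram matrix, K is symmetric positive semi-definite by construction.
```

In practice, eNTK computation for real architectures uses automatic differentiation (e.g. per-example Jacobians in JAX or PyTorch) rather than hand-derived gradients, and much current work targets the cost of the full Jacobian, which scales with the parameter count.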

Papers