Linear Connectivity
Linear connectivity in neural networks investigates whether different trained models can be joined by low-loss paths in parameter space, and what those paths look like. Current research focuses on the role of permutation symmetries and on algorithms that efficiently find or create such connections, with particular attention to their implications for generalization and the shape of the optimization landscape. This work aims to deepen our understanding of neural network training dynamics and may lead to more efficient training methods and more robust models. The findings are relevant both to the theory of deep learning and to practical applications such as model interpolation and ensemble methods.
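The two core ideas above can be illustrated concretely. A minimal sketch, using a toy linear-regression loss rather than a real neural network: evaluate the loss along the straight line between two solutions (the peak above the endpoints' average is the "loss barrier"; a near-zero barrier means the two solutions are linearly connected), and show that permuting the hidden units of a small ReLU network, with the matching permutation applied to both weight matrices, leaves the computed function unchanged. All names and the toy setup here are illustrative, not from any particular paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Loss barrier along a linear path ---------------------------------
# Toy loss surface: mean squared error of a linear model on synthetic data.
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = X @ w_true

def loss(w):
    return np.mean((X @ w - y) ** 2)

# Two hypothetical "trained" solutions (here: the true weights plus noise).
w_a = w_true + 0.1 * rng.normal(size=5)
w_b = w_true + 0.1 * rng.normal(size=5)

# Evaluate the loss at points along the segment (1 - a) * w_a + a * w_b.
alphas = np.linspace(0.0, 1.0, 11)
path_losses = [loss((1 - a) * w_a + a * w_b) for a in alphas]

# Barrier: how far the path rises above the average of the endpoint losses.
barrier = max(path_losses) - 0.5 * (path_losses[0] + path_losses[-1])

# --- Permutation symmetry of hidden units -----------------------------
# A two-layer ReLU network: x -> relu(x W1) W2.
W1 = rng.normal(size=(5, 8))
W2 = rng.normal(size=(8, 3))
x = rng.normal(size=(4, 5))

def forward(W1, W2, x):
    return np.maximum(x @ W1, 0) @ W2

# Permuting the hidden units (columns of W1, rows of W2) is a symmetry:
# the permuted network computes exactly the same function.
P = np.eye(8)[rng.permutation(8)]
same_function = np.allclose(forward(W1, W2, x), forward(W1 @ P, P.T @ W2, x))
```

Permutation-alignment methods exploit exactly this symmetry: before interpolating between two independently trained networks, they search for the hidden-unit permutation that brings the weights into correspondence, which can dramatically lower the barrier along the linear path.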