Neural Collapse
Neural collapse (NC) describes a striking geometric structure consistently observed in the final layer of deep classifiers late in training: last-layer features collapse to their class means, the class means form a simplex equiangular tight frame (ETF), and the classifier weights align with those means (self-duality). Current research focuses on understanding NC's emergence across architectures and tasks, including classification, regression, and even language modeling, often employing unconstrained feature models for theoretical analysis. The phenomenon has implications for improving model generalization, fairness, and robustness, and informs novel training strategies and out-of-distribution detection methods.
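The simplex ETF geometry at the heart of NC can be checked numerically: its columns are unit-norm and every pair meets at the same cosine of -1/(C-1), the maximal equal separation of C vectors. A minimal NumPy sketch (the class count C here is arbitrary, chosen only for illustration):

```python
import numpy as np

C = 4  # number of classes (illustrative choice)

# Standard simplex ETF construction: scale the centered identity so
# that each column has unit norm.
M = np.sqrt(C / (C - 1)) * (np.eye(C) - np.ones((C, C)) / C)

# Gram matrix: ones on the diagonal (unit norms),
# -1/(C-1) off the diagonal (equiangular, maximally separated).
G = M.T @ M
print(np.round(G, 4))
```

Under NC, the matrix of (recentred, renormalized) class-mean features converges to such an M, and measuring how far the empirical Gram matrix is from G is one common way papers quantify the collapse.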