Neural Collapse
Neural collapse (NC) describes a surprising geometric structure consistently observed in the final layers of deep neural networks during the terminal phase of training: last-layer features of each class collapse to their class mean, the class means (after centering) form a simplex equiangular tight frame (ETF), and the classifier weights align with those means. Current research focuses on understanding NC's emergence across various architectures and tasks, including classification, regression, and even language modeling, often employing unconstrained feature models for theoretical analysis. This phenomenon has implications for improving model generalization, fairness, and robustness, as well as for developing novel training strategies and out-of-distribution detection methods.
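As a concrete illustration of the geometry described above, here is a minimal NumPy sketch (not from any particular paper; the function name, dimensions, and construction are illustrative) that builds a simplex ETF and verifies its defining property: every pair of distinct class-mean directions meets at the same cosine similarity of -1/(K-1).

```python
import numpy as np

def simplex_etf(K, d, seed=0):
    """Construct a K-class simplex ETF embedded in d dimensions (requires d >= K-1).

    Columns of the returned d x K matrix are unit-norm class-mean directions
    given by M = sqrt(K/(K-1)) * P @ (I_K - (1/K) * 1 1^T), where P has
    orthonormal columns.
    """
    assert d >= K - 1
    rng = np.random.default_rng(seed)
    # Reduced QR of a random d x K matrix yields P with orthonormal columns.
    P, _ = np.linalg.qr(rng.standard_normal((d, K)))
    M = np.sqrt(K / (K - 1)) * P @ (np.eye(K) - np.ones((K, K)) / K)
    return M

K, d = 4, 16
M = simplex_etf(K, d)

# Each class-mean direction has unit norm ...
norms = np.linalg.norm(M, axis=0)
# ... and all pairs of distinct directions have cosine similarity -1/(K-1),
# the maximally equiangular arrangement of K points on a sphere.
gram = M.T @ M
off_diag = gram[~np.eye(K, dtype=bool)]
print(np.allclose(norms, 1.0))              # True
print(np.allclose(off_diag, -1 / (K - 1)))  # True
```

In the empirical NC literature this structure is measured rather than constructed: one computes the class means of the trained network's last-layer features and checks how close their Gram matrix is to the ideal one above, alongside how small the within-class feature variability has become.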