Weight Space
Weight-space research studies the properties and potential uses of the space of all possible weight configurations of a neural network. Current efforts focus on methods to efficiently represent, process, and learn from these weight spaces, including the design of equivariant architectures and the application of techniques such as low-rank adaptation and weight fusion. This line of work matters because it offers routes to better model efficiency, generalization, and interpretability, with applications ranging from efficient fine-tuning of large language models to improved out-of-distribution generalization and novel model-editing techniques.
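To make two of the techniques named above concrete, here is a minimal sketch of a low-rank adaptation (LoRA-style) update and a simple uniform weight fusion, written in plain NumPy. All names and sizes here are illustrative assumptions, not the API of any specific library:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 64, 32, 4  # low-rank adaptation uses r << min(d_in, d_out)

W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))               # zero init: adapted model starts
                                       # identical to the frozen one

def forward(x):
    # Effective weight is W + B @ A, but it is never materialized;
    # only A and B (r*(d_in + d_out) parameters) would be trained.
    return W @ x + B @ (A @ x)

x = rng.normal(size=d_in)
# With B = 0 the adapted output equals the frozen model's output.
assert np.allclose(forward(x), W @ x)

# Weight fusion, in its simplest form, is an element-wise average of
# checkpoints that share an architecture (illustrative sketch).
W_a = rng.normal(size=(d_out, d_in))
W_b = rng.normal(size=(d_out, d_in))
W_fused = 0.5 * (W_a + W_b)
```

The zero initialization of `B` is the key design choice in the LoRA-style sketch: it guarantees the adapted network reproduces the pretrained one at step zero, so fine-tuning starts from the frozen model's behavior.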
Papers
The listing covers nineteen papers dated from July 22, 2022 through November 11, 2024.