Low-Rank Matrices
Low-rank matrix approximation represents a large matrix as the product of two much smaller factor matrices, reducing both computational cost and storage requirements. Current research emphasizes efficient algorithms for low-rank matrix completion (inferring missing entries from a subset of observed ones), parameter-efficient fine-tuning of large pre-trained models such as LLMs via low-rank weight updates (e.g., LoRA and its variants), and improved initialization strategies that speed convergence and improve final performance. These advances matter across machine learning, signal processing, and data analysis, since they make high-dimensional datasets tractable to store and manipulate.
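The two core ideas above can be illustrated in a few lines of NumPy. The sketch below shows (a) the best rank-r approximation of a matrix via truncated SVD (the Eckart-Young theorem), along with the storage savings, and (b) a minimal LoRA-style low-rank update of a frozen weight matrix. All matrix sizes and the rank are illustrative choices, not taken from any particular paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def low_rank_approx(M, r):
    """Best rank-r approximation of M in the Frobenius norm (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# (a) Build a 100x80 matrix with exact rank 5; rank-5 truncation recovers it.
A = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 80))
A5 = low_rank_approx(A, 5)
err = np.linalg.norm(A - A5) / np.linalg.norm(A)
print(f"relative error at rank 5: {err:.2e}")

# Storage: the full matrix holds 100*80 = 8000 numbers; its rank-5 factors
# hold 5*(100 + 80) = 900 numbers, roughly a 9x reduction.

# (b) LoRA-style update (sketch): adapt a frozen weight W with a rank-r delta
# W + B @ A_lora, training only the small factors B and A_lora.
d_out, d_in, r = 64, 64, 4
W = rng.standard_normal((d_out, d_in))   # frozen pretrained weight
B = np.zeros((d_out, r))                 # trainable, zero-initialized
A_lora = rng.standard_normal((r, d_in))  # trainable
W_adapted = W + B @ A_lora               # equals W at initialization
```

Zero-initializing `B` is the standard LoRA choice: the adapted weight starts exactly equal to the pretrained weight, so fine-tuning begins from the original model's behavior.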