Compact Representation
Compact representation research focuses on efficiently encoding complex data, such as images, language models, and 3D point clouds, into smaller formats while preserving essential information. Current efforts employ transformer-based architectures, implicit neural representations, and clustering methods such as k-means, often combined with low-rank approximations or other dimensionality-reduction strategies. These advances are crucial for improving the efficiency and scalability of numerous applications, ranging from digital pathology and robotics to large language model evaluation and machine learning model training.
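As a concrete illustration of two techniques named above, the sketch below (a minimal NumPy example; all function names and parameters are illustrative, not drawn from any paper in this list) compresses a matrix with a truncated SVD (a low-rank approximation) and builds a small k-means codebook that replaces each row with a centroid index.

```python
import numpy as np

def low_rank_compress(X, rank):
    """Keep only the top-`rank` singular components of X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank]

def low_rank_reconstruct(U, s, Vt):
    """Rebuild an approximation of X from its truncated factors."""
    return (U * s) @ Vt

def kmeans_codebook(X, k, iters=20, seed=0):
    """Plain Lloyd's k-means: returns centroids (the codebook)
    and a per-row code index into it."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each row to its nearest centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        codes = d.argmin(axis=1)
        # Move each centroid to the mean of its assigned rows.
        for j in range(k):
            if np.any(codes == j):
                centroids[j] = X[codes == j].mean(axis=0)
    return centroids, codes

rng = np.random.default_rng(1)
X = rng.standard_normal((64, 32))

# Low-rank: store 64*8 + 8 + 8*32 numbers instead of 64*32.
U, s, Vt = low_rank_compress(X, rank=8)
X_hat = low_rank_reconstruct(U, s, Vt)

# Codebook: store 4*32 centroid numbers plus one small index per row.
centroids, codes = kmeans_codebook(X, k=4)
```

Both methods trade reconstruction fidelity for storage: the SVD rank and the codebook size `k` are the knobs that set where each sits on that trade-off.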
Papers
October 3, 2024
September 6, 2024
July 8, 2024
June 8, 2024
May 2, 2024
March 20, 2024
March 18, 2024
January 14, 2024
December 31, 2023
November 2, 2023
October 16, 2023
July 4, 2023
May 24, 2023
December 29, 2022
November 12, 2022
September 19, 2022
August 18, 2022
July 28, 2022
June 14, 2022