Vector Quantization
Vector quantization (VQ) is a technique for representing continuous data with a discrete set of codebook vectors, enabling efficient data compression and discrete representation learning. Current research focuses on improving VQ across a range of applications — image and audio generation, speech synthesis, and large language model compression — often pairing it with architectures such as autoregressive transformers and energy-based models to improve accuracy and efficiency. The impact of VQ extends across diverse fields, from improving the interpretability and robustness of AI systems to enabling efficient compression and processing of large datasets in resource-constrained environments.
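At its core, VQ maps each input vector to the index of its nearest codebook entry, so the data can be stored or transmitted as small integer codes and reconstructed from the codebook. The sketch below illustrates this encode/decode step in plain Python; the codebook here is hand-picked for illustration (in practice it would be learned, e.g. with k-means or end-to-end as in a VQ-VAE), and the function name `quantize` is our own.

```python
# Minimal sketch of the vector-quantization encode/decode step.
# Assumptions: the codebook is given (normally it would be learned,
# e.g. via k-means); `quantize` is an illustrative name, not a
# reference to any particular library's API.

def quantize(vectors, codebook):
    """Map each vector to its nearest codebook entry (squared
    Euclidean distance). Returns the discrete codes and the
    reconstruction obtained by looking the codes back up."""
    codes, recon = [], []
    for v in vectors:
        # Brute-force nearest-neighbour search over the codebook.
        k = min(
            range(len(codebook)),
            key=lambda i: sum((a - b) ** 2 for a, b in zip(v, codebook[i])),
        )
        codes.append(k)          # integer code: this is what gets stored
        recon.append(codebook[k])  # lossy reconstruction of the input
    return codes, recon

if __name__ == "__main__":
    codebook = [(0.0, 0.0), (1.0, 1.0), (-1.0, 1.0)]
    data = [(0.1, -0.1), (0.9, 1.2)]
    codes, recon = quantize(data, codebook)
    print(codes)  # → [0, 1]
```

Each input is thus replaced by a single integer (here, one of three codes instead of two floats), which is the source of VQ's compression: the rate is set by the codebook size, and the distortion by how well the codebook covers the data.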