Discrete Latent Representation
Discrete latent representation focuses on encoding data into discrete, rather than continuous, latent variables, aiming for improved interpretability, disentanglement of factors, and efficient data compression. Current research emphasizes vector quantization (VQ)-based autoencoders and variational autoencoders (VAEs), often incorporating techniques like residual vector quantization and autoregressive modeling to enhance the quality and diversity of generated data. These methods find applications across diverse fields, including speech processing, time series analysis, image generation, and human pose estimation, offering improvements in data generation, representation learning, and downstream task performance.
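The core operation behind the VQ-based autoencoders mentioned above is mapping each continuous encoder output to its nearest entry in a learned codebook; residual vector quantization repeats this over several codebooks, each stage coding the residual left by the previous one. A minimal NumPy sketch of both lookups is shown below — the function names, shapes, and the use of random codebooks are illustrative assumptions, not any particular paper's implementation:

```python
import numpy as np

def vector_quantize(z, codebook):
    """Map each continuous latent in z to its nearest codebook entry.

    z:        (N, D) array of encoder outputs.
    codebook: (K, D) array of K discrete code vectors.
    Returns (quantized vectors, integer code indices).
    """
    # Squared Euclidean distance between every latent and every code:
    # ||z - c||^2 = ||z||^2 - 2 z.c + ||c||^2
    dists = (
        np.sum(z ** 2, axis=1, keepdims=True)
        - 2.0 * z @ codebook.T
        + np.sum(codebook ** 2, axis=1)
    )
    indices = np.argmin(dists, axis=1)   # the discrete latent codes
    return codebook[indices], indices

def residual_quantize(z, codebooks):
    """Residual VQ sketch: quantize in stages, each stage encoding the
    residual left by the previous one. codebooks is a list of (K, D) arrays."""
    residual = z
    recon = np.zeros_like(z)
    all_indices = []
    for cb in codebooks:
        q, idx = vector_quantize(residual, cb)
        recon += q                        # accumulate the reconstruction
        residual = residual - q           # pass the leftover to the next stage
        all_indices.append(idx)
    return recon, np.stack(all_indices, axis=1)

# Toy usage with random data (hypothetical sizes: K=8 codes, D=4 dims).
rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))
z = rng.normal(size=(16, 4))
zq, idx = vector_quantize(z, codebook)
recon, codes = residual_quantize(z, [codebook, rng.normal(size=(8, 4))])
```

In a trained VQ-VAE the codebook is learned jointly with the encoder and decoder (gradients typically pass through the quantizer via a straight-through estimator), but the nearest-neighbour lookup itself is exactly this argmin over distances.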