Variational Transformer
Variational Transformers are a class of neural network models that combine variational autoencoders (VAEs) with the attention mechanisms of transformers to address challenges in diverse data generation and compression tasks. Current research focuses on applying these models to complex sequential data, such as multivariate time series, human motion, and climate data, often incorporating hierarchical structures or layer-wise latent-variable inference to improve performance. This approach offers significant potential for improving data efficiency in resource-intensive applications like weather forecasting, and for enabling more realistic and diverse generation in areas such as 3D animation and text synthesis.
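The core idea can be sketched in a few lines: a transformer-style encoder maps a sequence to the parameters of a Gaussian latent distribution, a sample is drawn via the reparameterization trick, and a KL term regularizes the latent toward a standard normal prior. The sketch below is a minimal, single-head NumPy illustration with hypothetical weight names (`Wq`, `Wk`, `Wv`, `Wmu`, `Wlv`), not any specific paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # Single-head scaled dot-product attention over the sequence.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

def encode(x, params):
    # Transformer-style encoding, then mean-pool to a single vector
    # that parameterizes a diagonal Gaussian over the latent z.
    h = self_attention(x, params["Wq"], params["Wk"], params["Wv"])
    pooled = h.mean(axis=0)
    mu = pooled @ params["Wmu"]
    logvar = pooled @ params["Wlv"]
    return mu, logvar

def reparameterize(mu, logvar, rng):
    # z = mu + sigma * eps keeps sampling differentiable w.r.t. mu, logvar.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def kl_to_standard_normal(mu, logvar):
    # KL( N(mu, sigma^2) || N(0, 1) ), the VAE regularizer.
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

d_model, d_z, seq_len = 8, 4, 5
params = {
    "Wq": rng.standard_normal((d_model, d_model)) * 0.1,
    "Wk": rng.standard_normal((d_model, d_model)) * 0.1,
    "Wv": rng.standard_normal((d_model, d_model)) * 0.1,
    "Wmu": rng.standard_normal((d_model, d_z)) * 0.1,
    "Wlv": rng.standard_normal((d_model, d_z)) * 0.1,
}
x = rng.standard_normal((seq_len, d_model))
mu, logvar = encode(x, params)
z = reparameterize(mu, logvar, rng)
print(z.shape)  # latent sample of dimension d_z
```

A full model would add a decoder that reconstructs the sequence from `z` (often conditioning the transformer decoder on the latent) and train by maximizing the ELBO: reconstruction likelihood minus the KL term above. Hierarchical variants attach such a latent to each layer rather than a single pooled vector.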