Autoregressive Transformer
Autoregressive transformers are a class of neural network models that generate sequences one element at a time, with each prediction conditioned on all previously generated elements. Current research focuses on improving their efficiency and extending them to diverse data types, including time series, images, 3D shapes, and even analog circuit simulations, often employing novel attention mechanisms and training strategies such as sequence packing and contrastive learning to enhance performance. These advances matter because they enable the generation of high-quality, complex data across many domains, with impact ranging from image synthesis and 3D modeling to natural language processing and scientific simulation.