Autoregressive Model
Autoregressive models are a class of generative models that generate sequential data by modeling the probability of each element conditioned on the elements that precede it. Current research focuses on enhancing their controllability for tasks such as image and 3D shape generation, improving their efficiency through parallel decoding and non-autoregressive techniques, and applying them to diverse domains including time series forecasting, speech recognition, and scientific modeling of physical systems. These advances are driving improvements across applications, from more accurate weather forecasting and more efficient speech recognition to novel approaches in drug discovery and materials science.
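All of the papers below share the same underlying formulation: the joint probability of a sequence is factorized by the chain rule as p(x_1, ..., x_T) = ∏_t p(x_t | x_1, ..., x_{t-1}), and generation proceeds one element at a time, feeding each prediction back in as context. The minimal Python sketch below illustrates that loop; next_token_probs is a hypothetical stand-in for a trained network and is not taken from any of the listed papers.

```python
# Minimal sketch of autoregressive generation (illustrative only).
# The joint distribution is factorized as
#   p(x_1, ..., x_T) = prod_t p(x_t | x_1, ..., x_{t-1}),
# so sampling proceeds one token at a time, conditioning on what was drawn so far.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["<bos>", "a", "b", "c", "<eos>"]

def next_token_probs(context):
    """Hypothetical stand-in for a trained model: maps the context so far
    to a distribution over the next token (here, just toy random logits)."""
    logits = rng.normal(size=len(VOCAB))
    logits[VOCAB.index("<bos>")] = -1e9                  # never re-emit <bos>
    logits[VOCAB.index("<eos>")] += 0.3 * len(context)   # favor stopping as the sequence grows
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()                               # softmax

def generate(max_len=10):
    context = ["<bos>"]
    for _ in range(max_len):
        probs = next_token_probs(context)
        token = rng.choice(VOCAB, p=probs)               # sample x_t ~ p(x_t | x_<t)
        context.append(str(token))
        if context[-1] == "<eos>":
            break
    return context

print(generate())
```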
Papers
ControlAR: Controllable Image Generation with Autoregressive Models
Zongming Li, Tianheng Cheng, Shoufa Chen, Peize Sun, Haocheng Shen, Longjin Ran, Xiaoxin Chen, Wenyu Liu, Xinggang Wang
Three-in-One: Fast and Accurate Transducer for Hybrid-Autoregressive ASR
Hainan Xu, Travis M. Bartley, Vladimir Bataev, Boris Ginsburg
A Spark of Vision-Language Intelligence: 2-Dimensional Autoregressive Transformer for Efficient Finegrained Image Generation
Liang Chen, Sinan Tan, Zefan Cai, Weichu Xie, Haozhe Zhao, Yichi Zhang, Junyang Lin, Jinze Bai, Tianyu Liu, Baobao Chang
When a language model is optimized for reasoning, does it still show embers of autoregression? An analysis of OpenAI o1
R. Thomas McCoy, Shunyu Yao, Dan Friedman, Mathew D. Hardy, Thomas L. Griffiths