Autoregressive Model
Autoregressive models are a class of generative models that produce sequential data by modeling the probability of each element conditioned on the elements that precede it. Current research focuses on enhancing their controllability for tasks like image and 3D shape generation, improving their efficiency through parallel processing and non-autoregressive techniques, and applying them to diverse domains including time series forecasting, speech recognition, and scientific modeling of physical systems. These advances are driving improvements across applications, from more accurate weather forecasting and efficient speech recognition to novel approaches in drug discovery and materials science.
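The core idea above, factorizing a joint distribution as p(x) = ∏_t p(x_t | x_<t) and sampling one element at a time, can be sketched with a toy binary-sequence model. This is an illustrative assumption of ours, not a method from any paper listed below; the conditional distribution here is a simple Laplace-smoothed frequency, standing in for what would be a neural network in practice.

```python
import random

def next_element_probs(prefix):
    """Toy conditional p(x_t | x_<t) over {0, 1}.

    Hypothetical stand-in for a learned model: the probability of
    emitting 1 is the Laplace-smoothed frequency of 1s seen so far.
    """
    ones = sum(prefix)
    p_one = (ones + 1) / (len(prefix) + 2)
    return {0: 1.0 - p_one, 1: p_one}

def generate(length, seed=0):
    """Autoregressive sampling: draw each element conditioned on
    the sequence generated so far, left to right."""
    rng = random.Random(seed)
    seq = []
    for _ in range(length):
        probs = next_element_probs(seq)
        seq.append(1 if rng.random() < probs[1] else 0)
    return seq

print(generate(10))
```

The sequential dependence in `generate` is exactly what the parallel and non-autoregressive techniques mentioned above aim to relax, since each step must wait for all previous steps.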
Papers
SutraNets: Sub-series Autoregressive Networks for Long-Sequence, Probabilistic Forecasting
Shane Bergsma, Timothy Zeyl, Lei Guo
Emage: Non-Autoregressive Text-to-Image Generation
Zhangyin Feng, Runyi Hu, Liangxin Liu, Fan Zhang, Duyu Tang, Yong Dai, Xiaocheng Feng, Jiwei Li, Bing Qin, Shuming Shi
Generative Pretraining at Scale: Transformer-Based Encoding of Transactional Behavior for Fraud Detection
Ze Yu Zhao, Zheng Zhu, Guilin Li, Wenhan Wang, Bo Wang
An Efficient Imbalance-Aware Federated Learning Approach for Wearable Healthcare with Autoregressive Ratio Observation
Wenhao Yan, He Li, Kaoru Ota, Mianxiong Dong
Meaning Representations from Trajectories in Autoregressive Models
Tian Yu Liu, Matthew Trager, Alessandro Achille, Pramuditha Perera, Luca Zancato, Stefano Soatto