Transformer-Based
Transformer-based models use self-attention mechanisms to capture long-range dependencies in sequential data, and they achieve state-of-the-art results in tasks ranging from natural language processing and image recognition to time series forecasting and robotic control. Current research focuses on improving efficiency (e.g., through quantization and optimized architectures), enhancing generalization, and addressing challenges such as handling long sequences and endogeneity. These advances are influencing diverse scientific communities and practical applications, yielding more accurate, efficient, and robust models across numerous domains.
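The self-attention mechanism mentioned above can be sketched in a few lines: each position projects into queries, keys, and values, and the output at every position is a similarity-weighted mixture of all positions. This is a minimal single-head sketch in NumPy with randomly initialized (hypothetical) projection matrices, not the implementation from any of the papers listed below.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention for one head.

    x: (seq_len, d_model) input sequence; w_*: (d_model, d_k) projections.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                   # pairwise similarities, (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys: rows sum to 1
    return weights @ v                                # each output position mixes all positions

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 8, 4
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 4)
```

Because every position attends to every other, dependencies between distant tokens are captured in a single layer; the same quadratic `seq_len x seq_len` score matrix is also why long-sequence efficiency remains an active research topic.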
Papers
Enhancing Robot Route Optimization in Smart Logistics with Transformer and GNN Integration
Hao Luo, Jianjun Wei, Shuchen Zhao, Ankai Liang, Zhongjin Xu, Ruxue Jiang
Sensorformer: Cross-patch attention with global-patch compression is effective for high-dimensional multivariate time series forecasting
Liyang Qin, Xiaoli Wang, Chunhua Yang, Huaiwen Zou, Haochuan Zhang
Transformer-Driven Inverse Problem Transform for Fast Blind Hyperspectral Image Dehazing
Po-Wei Tang, Chia-Hsiang Lin, Yangrui Liu
VidFormer: A novel end-to-end framework fused by 3DCNN and Transformer for Video-based Remote Physiological Measurement
Jiachen Li, Shisheng Guo, Longzhen Tang, Cuolong Cui, Lingjiang Kong, Xiaobo Yang
Why Are Positional Encodings Nonessential for Deep Autoregressive Transformers? Revisiting a Petroglyph
Kazuki Irie
STARFormer: A Novel Spatio-Temporal Aggregation Reorganization Transformer of FMRI for Brain Disorder Diagnosis
Wenhao Dong, Yueyang Li, Weiming Zeng, Lei Chen, Hongjie Yan, Wai Ting Siok, Nizhuan Wang
A Full Transformer-based Framework for Automatic Pain Estimation using Videos
Stefanos Gkikas, Manolis Tsiknakis
PCA-Featured Transformer for Jamming Detection in 5G UAV Networks
Joseanne Viana, Hamed Farkhari, Pedro Sebastiao, Victor P Gil Jimenez, Lester Ho
GFormer: Accelerating Large Language Models with Optimized Transformers on Gaudi Processors
Chengming Zhang, Xinheng Ding, Baixi Sun, Xiaodong Yu, Weijian Zheng, Zhen Xie, Dingwen Tao
Transformer models are gauge invariant: A mathematical connection between AI and particle physics
Leo van Nierop
CAE-T: A Channelwise AutoEncoder with Transformer for EEG Abnormality Detection
Youshen Zhao, Keiji Iramina