Transformer-Based
Transformer-based models use self-attention to capture long-range dependencies in sequential data, and have achieved state-of-the-art results in tasks ranging from natural language processing and image recognition to time series forecasting and robotic control. Current research focuses on improving efficiency (e.g., through quantization and optimized architectures), enhancing generalization, and addressing challenges such as long input sequences and endogeneity. These advances are yielding more accurate, efficient, and robust models across many scientific and practical domains.
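As a minimal sketch of the self-attention mechanism shared by the papers below (not any individual paper's implementation), the following numpy code computes single-head scaled dot-product self-attention; the function and weight names (`self_attention`, `w_q`, `w_k`, `w_v`) and the toy dimensions are illustrative assumptions.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a token sequence x of shape (seq_len, d_model)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v            # project each token to query/key/value vectors
    scores = q @ k.T / np.sqrt(k.shape[-1])        # similarity of every position with every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True) # softmax: attention weights sum to 1 per query
    return weights @ v                             # each output is a weighted mix of all positions

# Toy usage (sizes chosen arbitrarily): 4 tokens, model width 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)      # (4, 8)
```

Because every position attends to every other, each output can depend on arbitrarily distant inputs, which is what lets transformers model long-range dependencies; the cost is quadratic in sequence length, motivating the efficiency research noted above.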
Papers
ResiDual: Transformer with Dual Residual Connections
Shufang Xie, Huishuai Zhang, Junliang Guo, Xu Tan, Jiang Bian, Hany Hassan Awadalla, Arul Menezes, Tao Qin, Rui Yan
3D Brainformer: 3D Fusion Transformer for Brain Tumor Segmentation
Rui Nian, Guoyao Zhang, Yao Sui, Yuqi Qian, Qiuying Li, Mingzhang Zhao, Jianhui Li, Ali Gholipour, Simon K. Warfield
Distinguishing a planetary transit from false positives: a Transformer-based classification for planetary transit signals
Helem Salinas, Karim Pichara, Rafael Brahm, Francisco Pérez-Galarce, Domingo Mery
Exploiting Inductive Bias in Transformer for Point Cloud Classification and Segmentation
Zihao Li, Pan Gao, Hui Yuan, Ran Wei, Manoranjan Paul
SoGAR: Self-supervised Spatiotemporal Attention-based Social Group Activity Recognition
Naga VS Raviteja Chappa, Pha Nguyen, Alexander H Nelson, Han-Seok Seo, Xin Li, Page Daniel Dobbs, Khoa Luu
DCN-T: Dual Context Network with Transformer for Hyperspectral Image Classification
Di Wang, Jing Zhang, Bo Du, Liangpei Zhang, Dacheng Tao
Transformer-Based Visual Segmentation: A Survey
Xiangtai Li, Henghui Ding, Haobo Yuan, Wenwei Zhang, Jiangmiao Pang, Guangliang Cheng, Kai Chen, Ziwei Liu, Chen Change Loy