Transformer-Based
Transformer-based models leverage self-attention mechanisms to capture long-range dependencies in sequential data, and they now achieve state-of-the-art results in tasks ranging from natural language processing and image recognition to time series forecasting and robotic control. Current research focuses on improving efficiency (e.g., through quantization and optimized architectures), enhancing generalization, and addressing challenges such as handling long sequences and endogeneity. These advances are shaping both scientific research and practical applications, yielding more accurate, efficient, and robust models across numerous domains.
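For readers unfamiliar with the mechanism the summary refers to, the following is a minimal sketch of scaled dot-product self-attention in plain NumPy. The names and dimensions are illustrative and not drawn from any of the papers listed below; real transformer layers additionally use learned query/key/value projections, multiple heads, and masking.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # similarity of every position to every other
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax: attention weights
    return weights @ V                            # each output is a weighted sum of the values

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                   # toy sequence: 4 positions, embedding dimension 8
out = scaled_dot_product_attention(x, x, x)   # self-attention: queries, keys, values all come from x
print(out.shape)                              # (4, 8)

Because every position attends to every other, each output row mixes information from the whole sequence, which is what lets transformers model long-range dependencies; it is also why naive attention costs quadratic time in sequence length, motivating the efficiency work mentioned above.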
Papers
TransFA: Transformer-based Representation for Face Attribute Evaluation
Decheng Liu, Weijie He, Chunlei Peng, Nannan Wang, Jie Li, Xinbo Gao
UniNet: Unified Architecture Search with Convolution, Transformer, and MLP
Jihao Liu, Xin Huang, Guanglu Song, Hongsheng Li, Yu Liu
Cross-Architecture Knowledge Distillation
Yufan Liu, Jiajiong Cao, Bing Li, Weiming Hu, Jingting Ding, Liang Li
Transformer based Models for Unsupervised Anomaly Segmentation in Brain MR Images
Ahmed Ghorbel, Ahmed Aldahdooh, Shadi Albarqouni, Wassim Hamidouche
FishFormer: Annulus Slicing-based Transformer for Fisheye Rectification with Efficacy Domain Exploration
Shangrong Yang, Chunyu Lin, Kang Liao, Yao Zhao
TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second
Noah Hollmann, Samuel Müller, Katharina Eggensperger, Frank Hutter
Multimodal Frame-Scoring Transformer for Video Summarization
Jeiyoon Park, Kiho Kwoun, Chanhee Lee, Heuiseok Lim