Transformer-Based
Transformer-based models leverage self-attention mechanisms to capture long-range dependencies in sequential data, achieving state-of-the-art results in tasks ranging from natural language processing and image recognition to time series forecasting and robotic control. Current research focuses on improving efficiency (e.g., through quantization and optimized architectures), enhancing generalization, and addressing challenges such as long sequences and endogeneity. These advances are yielding more accurate, efficient, and robust models across numerous scientific communities and practical applications.
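For readers unfamiliar with the core mechanism, the following is a minimal sketch of single-head scaled dot-product self-attention, the operation these models use to let every position attend to every other. It is illustrative only: the function name, random weights, and dimensions are assumptions for the example, not code from any paper listed below, and real transformers add multi-head projections, masking, and learned parameters.

```python
import numpy as np

def scaled_dot_product_self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention over a sequence x of shape (seq_len, d_model)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v             # project inputs to queries/keys/values
    scores = q @ k.T / np.sqrt(k.shape[-1])         # pairwise similarities, scaled by sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key positions
    return weights @ v                              # each output mixes the whole sequence

# Illustrative usage with random weights (hypothetical dimensions).
rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = scaled_dot_product_self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 8): every position's output depends on all positions
```

Because the attention weights span the full sequence, dependencies between distant positions are modeled in a single step, which is the property the summary above refers to as capturing long-range dependencies.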
Papers
BBT-Fin: Comprehensive Construction of Chinese Financial Domain Pre-trained Language Model, Corpus and Benchmark
Dakuan Lu, Hengkui Wu, Jiaqing Liang, Yipei Xu, Qianyu He, Yipeng Geng, Mengkun Han, Yingsi Xin, Yanghua Xiao
MorphGANFormer: Transformer-based Face Morphing and De-Morphing
Na Zhang, Xudong Liu, Xin Li, Guo-Jun Qi
Hyneter: Hybrid Network Transformer for Object Detection
Dong Chen, Duoqian Miao, Xuerong Zhao
Transformadores: Fundamentos teóricos y Aplicaciones (Transformers: Theoretical Foundations and Applications)
Jordi de la Torre
À-la-carte Prompt Tuning (APT): Combining Distinct Data Via Composable Prompting
Benjamin Bowman, Alessandro Achille, Luca Zancato, Matthew Trager, Pramuditha Perera, Giovanni Paolini, Stefano Soatto
ForceFormer: Exploring Social Force and Transformer for Pedestrian Trajectory Prediction
Weicheng Zhang, Hao Cheng, Fatema T. Johora, Monika Sester
Energy Transformer
Benjamin Hoover, Yuchen Liang, Bao Pham, Rameswar Panda, Hendrik Strobelt, Duen Horng Chau, Mohammed J. Zaki, Dmitry Krotov
Synthesizing audio from tongue motion during speech using tagged MRI via transformer
Xiaofeng Liu, Fangxu Xing, Jerry L. Prince, Maureen Stone, Georges El Fakhri, Jonghye Woo