Transformer-Based
Transformer-based models are reshaping a wide range of fields by using self-attention to capture long-range dependencies in sequential data, achieving state-of-the-art results in tasks from natural language processing and image recognition to time series forecasting and robotic control. Current research focuses on improving efficiency (e.g., through quantization and optimized architectures), enhancing generalization, and addressing challenges such as long-sequence modeling and endogeneity. These advances are having a significant impact across scientific communities and practical applications, yielding more accurate, efficient, and robust models in numerous domains.
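The self-attention mechanism the summary refers to can be sketched in a few lines of NumPy. This is a minimal illustration of scaled dot-product self-attention, not code from any of the papers below; the function name, shapes, and random projection matrices are illustrative assumptions.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence x of shape (seq_len, d_model)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v                 # project inputs to queries, keys, values
    d_k = k.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                     # similarity of every position with every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over key positions
    return weights @ v                                  # each output is a mixture of all positions

# Toy example: a 4-token sequence with model width 8 (values are random).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)
```

Because every output row is a weighted sum over all input positions, a single layer can relate arbitrarily distant tokens, which is what "capturing long-range dependencies" means in practice.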
Papers
Supra-Laplacian Encoding for Transformer on Dynamic Graphs
Yannis Karmim, Marc Lafon, Raphaël Fournier S'niehotta, Nicolas Thome
Ophthalmic Biomarker Detection with Parallel Prediction of Transformer and Convolutional Architecture
Md. Touhidul Islam, Md. Abtahi Majeed Chowdhury, Mahmudul Hasan, Asif Quadir, Lutfa Aktar
General Compression Framework for Efficient Transformer Object Tracking
Lingyi Hong, Jinglun Li, Xinyu Zhou, Shilin Yan, Pinxue Guo, Kaixun Jiang, Zhaoyu Chen, Shuyong Gao, Wei Zhang, Hong Lu, Wenqiang Zhang
A Novel Spinor-Based Embedding Model for Transformers
Rick White
MonoFormer: One Transformer for Both Diffusion and Autoregression
Chuyang Zhao, Yuxing Song, Wenhao Wang, Haocheng Feng, Errui Ding, Yifan Sun, Xinyan Xiao, Jingdong Wang
Transformer based time series prediction of the maximum power point for solar photovoltaic cells
Palaash Agrawal, Hari Om Bansal, Aditya R. Gautam, Om Prakash Mahela, Baseem Khan
One-shot World Models Using a Transformer Trained on a Synthetic Prior
Fabio Ferreira, Moreno Schlageter, Raghu Rajan, Andre Biedenkapp, Frank Hutter
FAMOUS: Flexible Accelerator for the Attention Mechanism of Transformer on UltraScale+ FPGAs
Ehsan Kabir, Md. Arafat Kabir, Austin R.J. Downey, Jason D. Bakos, David Andrews, Miaoqing Huang
ProTEA: Programmable Transformer Encoder Acceleration on FPGA
Ehsan Kabir, Jason D. Bakos, David Andrews, Miaoqing Huang
Axial Attention Transformer Networks: A New Frontier in Breast Cancer Detection
Weijie He, Runyuan Bao, Yiru Cang, Jianjun Wei, Yang Zhang, Jiacheng Hu
RoboMorph: In-Context Meta-Learning for Robot Dynamics Modeling
Manuel Bianchi Bazzi, Asad Ali Shahid, Christopher Agia, John Alora, Marco Forgione, Dario Piga, Francesco Braghin, Marco Pavone, Loris Roveda
TTT-Unet: Enhancing U-Net with Test-Time Training Layers for biomedical image segmentation
Rong Zhou, Zhengqing Yuan, Zhiling Yan, Weixiang Sun, Kai Zhang, Yiwei Li, Yanfang Ye, Xiang Li, Lifang He, Lichao Sun
American Sign Language to Text Translation using Transformer and Seq2Seq with LSTM
Gregorius Guntur Sunardi Putra, Adifa Widyadhani Chanda D'Layla, Dimas Wahono, Riyanarto Sarno, Agus Tri Haryono