Transformer Models
Transformer models are being investigated for a widening range of sequence processing tasks, moving beyond natural language processing into time series forecasting, image recognition, and scientific computing applications such as solving partial differential equations. Current research focuses on improving efficiency (e.g., through mixed-precision quantization and optimized architectures), enhancing generalization, particularly to sequences longer than those seen during training, and understanding the mechanisms underlying in-context learning. These advances improve the accuracy and efficiency of applications across diverse fields while deepening the theoretical understanding of these models.
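To make the efficiency theme above concrete, here is a minimal, illustrative sketch of running a small Transformer encoder under automatic mixed precision with PyTorch's torch.autocast. This is mixed-precision execution rather than the quantization methods studied in the papers below, and the layer sizes and inputs are hypothetical, not taken from any listed paper.

```python
import torch
import torch.nn as nn

# A small, illustrative Transformer encoder (sizes chosen arbitrarily).
encoder_layer = nn.TransformerEncoderLayer(
    d_model=256, nhead=8, dim_feedforward=1024, batch_first=True
)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=4)
encoder.eval()

device = "cuda" if torch.cuda.is_available() else "cpu"
encoder = encoder.to(device)
x = torch.randn(2, 128, 256, device=device)  # (batch, sequence length, model dim)

# autocast runs matmul-heavy ops in a lower-precision dtype where it is safe,
# while keeping numerically sensitive ops in float32.
dtype = torch.float16 if device == "cuda" else torch.bfloat16
with torch.no_grad(), torch.autocast(device_type=device, dtype=dtype):
    y = encoder(x)

print(y.shape)  # torch.Size([2, 128, 256])
```

On GPUs this kind of mixed-precision execution reduces memory traffic and speeds up attention and feed-forward matrix multiplications at little cost in accuracy, which is one reason precision reduction features prominently in current efficiency research.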
Papers
TexIm FAST: Text-to-Image Representation for Semantic Similarity Evaluation using Transformers
Wazib Ansar, Saptarsi Goswami, Amlan Chakrabarti
Transformers need glasses! Information over-squashing in language tasks
Federico Barbero, Andrea Banino, Steven Kapturowski, Dharshan Kumaran, João G.M. Araújo, Alex Vitvitskyi, Razvan Pascanu, Petar Veličković
Stable-Pose: Leveraging Transformers for Pose-Guided Text-to-Image Generation
Jiajun Wang, Morteza Ghahremani, Yitong Li, Björn Ommer, Christian Wachinger
Modeling Emotional Trajectories in Written Stories Utilizing Transformers and Weakly-Supervised Learning
Lukas Christ, Shahin Amiriparian, Manuel Milling, Ilhan Aslan, Björn W. Schuller
Direct Cardiac Segmentation from Undersampled K-space Using Transformers
Yundi Zhang, Nil Stolt-Ansó, Jiazhen Pan, Wenqi Huang, Kerstin Hammernik, Daniel Rueckert
Transformers are SSMs: Generalized Models and Efficient Algorithms Through Structured State Space Duality
Tri Dao, Albert Gu
Learning to Estimate System Specifications in Linear Temporal Logic using Transformers and Mamba
İlker Işık, Ebru Aydin Gol, Ramazan Gokberk Cinbis
Position Coupling: Improving Length Generalization of Arithmetic Transformers Using Task Structure
Hanseul Cho, Jaeyoung Cha, Pranjal Awasthi, Srinadh Bhojanapalli, Anupam Gupta, Chulhee Yun
Contextual Counting: A Mechanistic Study of Transformers on a Quantitative Task
Siavash Golkar, Alberto Bietti, Mariel Pettee, Michael Eickenberg, Miles Cranmer, Keiya Hirashima, Geraud Krawezik, Nicholas Lourie, Michael McCabe, Rudy Morel, Ruben Ohana, Liam Holden Parker, Bruno Régaldo-Saint Blancard, Kyunghyun Cho, Shirley Ho
OmniHands: Towards Robust 4D Hand Mesh Recovery via A Versatile Transformer
Dixuan Lin, Yuxiang Zhang, Mengcheng Li, Yebin Liu, Wei Jing, Qi Yan, Qianying Wang, Hongwen Zhang
Transformers and Slot Encoding for Sample Efficient Physical World Modelling
Francesco Petri, Luigi Asprino, Aldo Gangemi
Literature Filtering for Systematic Reviews with Transformers
John Hawkins, David Tivey