Transformer-Based Architectures
Transformer-based architectures, initially developed for natural language processing, are rapidly expanding into fields such as computer vision and robotics, where they aim to improve both efficiency and accuracy. Current research focuses on optimizing transformer models for specific applications: developing more efficient attention mechanisms (e.g., FlashAttention), exploring alternative architectures such as state-space models, and adapting transformers to resource-constrained environments. This surge in transformer applications is reshaping many scientific domains and practical systems, driving advances in areas such as image captioning, anomaly detection, and medical image analysis.
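At the core of every architecture in this family sits scaled dot-product attention. As a minimal illustration (a plain-Python sketch of the standard formulation, not the implementation from any paper listed below; the function and variable names are chosen for exposition):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention.

    For each query row Q[i], compute similarity scores against every key
    K[j], scale by sqrt(d_k), normalize with softmax, and return the
    weighted sum of the value rows V[j].
    """
    d_k = len(K[0])
    out, weights = [], []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        w = softmax(scores)
        weights.append(w)
        out.append([sum(wj * v[d] for wj, v in zip(w, V))
                    for d in range(len(V[0]))])
    return out, weights

# Toy example: 3 tokens with 2-dimensional queries, keys, and values.
Q = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
out, weights = attention(Q, K, V)
```

Efficiency work such as FlashAttention keeps this exact computation but reorders it into tiled, memory-aware kernels, so the quadratic score matrix is never fully materialized.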
Papers
TopoMask: Instance-Mask-Based Formulation for the Road Topology Problem via Transformer-Based Architecture
M. Esat Kalfaoglu, Halil Ibrahim Ozturk, Ozsel Kilinc, Alptekin Temizel
Extensive Evaluation of Transformer-based Architectures for Adverse Drug Events Extraction
Simone Scaboro, Beatrice Portelli, Emmanuele Chersoni, Enrico Santus, Giuseppe Serra