Full Transformer
Full Transformer networks are emerging as powerful alternatives to convolutional neural networks (CNNs) across a range of tasks, leveraging the transformer's self-attention to model long-range dependencies and global context. Current research focuses on improving efficiency and scalability for longer sequences, exploring novel architectures such as bi-directional transformers, and incorporating mechanisms that enhance feature extraction and robustness to noise. These advances are significantly impacting computer vision (e.g., object tracking and image registration), natural language processing, and remote sensing, improving performance on tasks previously dominated by CNN-based approaches.
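To make the "full" (encoder-decoder) transformer concrete, here is a minimal sketch using PyTorch's built-in nn.Transformer module; every hyperparameter value below is an illustrative assumption rather than a setting drawn from any of the papers this summary covers.

```python
import torch
import torch.nn as nn

# A small encoder-decoder ("full") transformer. Self-attention lets every
# position attend to every other, which is the long-range-dependency
# modeling the summary refers to. Sizes here are toy values for illustration.
d_model, nhead = 128, 4
model = nn.Transformer(
    d_model=d_model,
    nhead=nhead,
    num_encoder_layers=2,
    num_decoder_layers=2,
    dim_feedforward=256,
    batch_first=True,  # inputs shaped (batch, sequence, feature)
)

# Toy inputs: a source sequence of length 10 and a target sequence of length 7.
src = torch.randn(1, 10, d_model)
tgt = torch.randn(1, 7, d_model)

# Causal mask so each target position attends only to earlier positions,
# as in standard autoregressive decoding.
tgt_mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))

out = model(src, tgt, tgt_mask=tgt_mask)
print(out.shape)  # torch.Size([1, 7, 128])
```

Note that self-attention is quadratic in sequence length, which is why the efficiency and scalability work mentioned above targets longer inputs.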