Transformer-Based Networks
Transformer-based networks are a class of deep learning models achieving state-of-the-art results across diverse applications by leveraging self-attention mechanisms to capture long-range dependencies within data. Current research focuses on improving efficiency (e.g., through pruning, lightweight architectures, and optimized attention mechanisms), enhancing explainability, and adapting transformers to specific data modalities (e.g., images, point clouds, time series). These advancements are significantly impacting fields like computer vision, natural language processing, and medical image analysis, leading to improved accuracy and efficiency in tasks ranging from object detection to medical diagnosis.
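The core mechanism referenced above, self-attention, lets every position in a sequence attend directly to every other position, which is how long-range dependencies are captured in a single layer. A minimal NumPy sketch of scaled dot-product self-attention follows; the projection matrices `Wq`, `Wk`, `Wv` and the dimensions are illustrative assumptions, not taken from any specific paper:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model).

    Returns the attended output and the attention weight matrix.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # project tokens to queries, keys, values
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # pairwise token affinities, (seq_len, seq_len)
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights                # each output mixes information from all positions

# Toy example with arbitrary sizes (assumed for illustration).
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 8, 4
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
print(out.shape)   # (5, 4)
```

Because every output row is a convex combination (each attention row sums to 1) of all value vectors, even the first and last tokens interact in one step, unlike the fixed local windows of convolutional layers.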