Transformer-Based Networks
Transformer-based networks are a class of deep learning models that achieve state-of-the-art results across diverse applications by using self-attention mechanisms to capture long-range dependencies within data. Current research focuses on improving efficiency (e.g., through pruning, lightweight architectures, and optimized attention mechanisms), enhancing explainability, and adapting transformers to specific data modalities (e.g., images, point clouds, time series). These advances are reshaping fields such as computer vision, natural language processing, and medical image analysis, improving both accuracy and efficiency in tasks ranging from object detection to medical diagnosis.
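To make the core mechanism concrete, here is a minimal sketch of scaled dot-product self-attention in NumPy. The function name `self_attention` and the random weight matrices are illustrative, not from any particular library; real transformer implementations add multiple heads, masking, and learned projections trained end to end.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Every position attends to every other position in one step,
    # which is how long-range dependencies are captured directly.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)   # rows sum to 1
    return weights @ V, weights

# Toy example: a 5-token sequence with random (untrained) projections.
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 5, 8, 4
X = rng.normal(size=(seq_len, d_model))
Wq = rng.normal(size=(d_model, d_k))
Wk = rng.normal(size=(d_model, d_k))
Wv = rng.normal(size=(d_model, d_k))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)      # (5, 4): one attended vector per token
```

Note that the attention matrix is seq_len x seq_len, so its cost grows quadratically with sequence length; this is the bottleneck the "optimized attention mechanisms" mentioned above aim to reduce.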