Local Transformer
Local transformers aim to improve the efficiency and effectiveness of transformer architectures by focusing computation on localized regions of the input while still capturing global context. Current research emphasizes hybrid models that combine local transformer blocks with convolutional neural networks or global attention mechanisms, often within encoder-decoder frameworks, to address limitations in handling high-frequency information and long-range dependencies. These advances are improving the accuracy and efficiency of high-dimensional data processing in fields such as image segmentation, object tracking, and remote sensing.
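The core idea of restricting attention to localized regions can be illustrated with a minimal NumPy sketch of non-overlapping window (block-local) self-attention. This is an illustrative toy implementation, not the method of any specific paper; the function names, shapes, and the choice of non-overlapping windows are assumptions for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_attention(q, k, v, window):
    """Block-local self-attention: each token attends only to tokens
    in its own non-overlapping window of size `window`.

    q, k, v: arrays of shape (seq_len, d).
    Cost is O(seq_len * window * d) instead of O(seq_len^2 * d)
    for full attention.
    """
    seq_len, d = q.shape
    out = np.zeros_like(v)
    for start in range(0, seq_len, window):
        end = min(start + window, seq_len)
        # Attention scores restricted to the local window.
        scores = q[start:end] @ k[start:end].T / np.sqrt(d)
        out[start:end] = softmax(scores) @ v[start:end]
    return out
```

When `window` covers the whole sequence, this reduces to standard full self-attention; hybrid architectures interleave such local blocks with global attention or convolutional layers to recover long-range interactions.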