Local Transformer
Local transformers aim to improve the efficiency and effectiveness of transformer architectures by concentrating computational resources on localized regions of the input while still capturing global context. Current research emphasizes hybrid models that combine local transformer blocks with convolutional neural networks or global attention mechanisms, often within encoder-decoder frameworks, to address limitations in handling high-frequency information and long-range dependencies. These advances are improving the accuracy and efficiency of processing high-dimensional data in fields such as image segmentation, object tracking, and remote sensing.
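The core idea of restricting attention to localized regions can be illustrated with a minimal sketch of single-head windowed attention. This is not any specific paper's method; it assumes a sequence length divisible by the window size, and all names are illustrative. Each token attends only to tokens in its own fixed-size window, reducing the cost from O(n²) for full attention to O(n·w):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_attention(q, k, v, window=4):
    """Windowed (local) self-attention sketch: attention is computed
    independently inside each non-overlapping window of `window` tokens."""
    n, d = q.shape
    assert n % window == 0, "sketch assumes length divisible by window"
    out = np.empty_like(v)
    for start in range(0, n, window):
        s = slice(start, start + window)
        scores = q[s] @ k[s].T / np.sqrt(d)   # (window, window) scores
        out[s] = softmax(scores) @ v[s]       # attend within the window only
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16))
y = local_attention(x, x, x, window=4)
print(y.shape)
```

Hybrid designs discussed above then recover global context on top of such local blocks, e.g. by interleaving them with convolutional layers or sparse global attention.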