Pyramid Transformer
Pyramid Transformers are a class of deep learning models that use hierarchical structures to process data efficiently, particularly long sequences or high-resolution inputs. Current research focuses on adapting the architecture to tasks such as image segmentation, time series forecasting, and natural language processing, often combining it with convolutional or recurrent neural networks to improve performance. By progressively reducing the number of tokens from stage to stage, this approach addresses the computational-cost and scalability limitations of standard Transformers, yielding improved results in applications such as medical imaging analysis, autonomous driving, and satellite image processing.
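To make the hierarchical idea concrete, the following is a minimal sketch of one pyramid stage, not the implementation from any particular paper: neighbouring patches are merged by a strided convolution before self-attention, so each successive stage attends over fewer, coarser tokens and costs less. The class name `PyramidStage`, the `reduction` factor, and the three-stage configuration are illustrative assumptions.

```python
import torch
import torch.nn as nn


class PyramidStage(nn.Module):
    """One pyramid stage: patch merging (downsample) followed by a Transformer block."""

    def __init__(self, in_dim: int, out_dim: int, num_heads: int, reduction: int = 2):
        super().__init__()
        # Strided convolution merges neighbouring patches, shrinking the feature
        # map by `reduction` per side and the token count by reduction**2.
        self.merge = nn.Conv2d(in_dim, out_dim, kernel_size=reduction, stride=reduction)
        self.norm1 = nn.LayerNorm(out_dim)
        self.attn = nn.MultiheadAttention(out_dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(out_dim)
        self.mlp = nn.Sequential(
            nn.Linear(out_dim, 4 * out_dim), nn.GELU(), nn.Linear(4 * out_dim, out_dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, H, W) feature map from the previous stage.
        x = self.merge(x)                      # (B, out_dim, H/r, W/r)
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)  # (B, H*W / r^2, out_dim)
        t = self.norm1(tokens)
        attn_out, _ = self.attn(t, t, t)       # self-attention over the reduced token set
        tokens = tokens + attn_out
        tokens = tokens + self.mlp(self.norm2(tokens))
        return tokens.transpose(1, 2).reshape(b, c, h, w)


if __name__ == "__main__":
    # Three stages form the pyramid: 64x64 -> 32x32 -> 16x16 -> 8x8 spatial resolution.
    stages = nn.Sequential(
        PyramidStage(3, 64, num_heads=2),
        PyramidStage(64, 128, num_heads=4),
        PyramidStage(128, 256, num_heads=8),
    )
    img = torch.randn(1, 3, 64, 64)
    print(stages(img).shape)  # torch.Size([1, 256, 8, 8])
```

Because each stage quadratically cheapens attention by quartering the token count, deep stacks remain tractable on high-resolution inputs; published variants additionally use tricks such as spatial-reduction attention or window attention within each stage.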