Transformer-Based Structures
Transformer-based structures are revolutionizing various fields by leveraging attention mechanisms to model long-range dependencies in data. Current research focuses on improving efficiency (e.g., through token reduction and compact architectures like Mixers), addressing limitations such as oversmoothing and adversarial vulnerability, and adapting transformers to specific tasks (e.g., audio classification, image inpainting, and industrial prognostics). These advances are improving model accuracy, robustness, and efficiency across diverse applications, from medical imaging and natural language processing to deployment in resource-constrained settings.
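To make the core idea concrete, below is a minimal sketch of scaled dot-product self-attention, the mechanism that lets transformers model long-range dependencies: every token attends to every other token regardless of distance, so no fixed receptive field limits the interactions. The function and all tensor sizes are illustrative assumptions, not taken from any specific paper listed on this page.

```python
import torch
import torch.nn.functional as F

def self_attention(x: torch.Tensor,
                   w_q: torch.Tensor,
                   w_k: torch.Tensor,
                   w_v: torch.Tensor) -> torch.Tensor:
    """x: (batch, seq_len, d_model); w_*: (d_model, d_head) projection weights."""
    q = x @ w_q                                          # queries
    k = x @ w_k                                          # keys
    v = x @ w_v                                          # values
    d_head = q.shape[-1]
    # The score matrix compares every token with every other token,
    # so dependencies of arbitrary range are captured in a single step.
    scores = q @ k.transpose(-2, -1) / d_head ** 0.5     # (batch, seq, seq)
    weights = F.softmax(scores, dim=-1)                  # attention weights
    return weights @ v                                   # (batch, seq, d_head)

# Illustrative usage with arbitrary sizes (hypothetical, for demonstration only).
batch, seq_len, d_model, d_head = 2, 16, 32, 8
x = torch.randn(batch, seq_len, d_model)
w_q, w_k, w_v = (torch.randn(d_model, d_head) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # torch.Size([2, 16, 8])
```

Because the score matrix is quadratic in sequence length, efficiency work such as token reduction and compact Mixer-style architectures targets exactly this cost.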
Papers
Fifteen papers, published between March 2, 2022 and October 16, 2024.