Frequency Self-Attention
Frequency self-attention mechanisms leverage the spectral properties of data (such as images or audio) to improve the performance of deep learning models. Current research focuses on integrating frequency-aware self-attention into a range of architectures, including transformers and convolutional neural networks, often incorporating techniques such as the fractional Fourier transform to handle non-stationary signals. By operating on spectral representations rather than raw features, this approach enhances feature extraction and information utilization, yielding improved results in applications such as image deblurring, super-resolution, anomaly detection, and semantic segmentation, often at lower computational cost than traditional self-attention.
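To make the idea concrete, here is a minimal illustrative sketch of one way a frequency self-attention block can be organized: the input is moved to the frequency domain with an FFT, standard scaled dot-product attention is computed over the magnitude spectrum, and the attended weights gate the complex spectrum before transforming back. This is a simplified toy, not the method of any specific paper surveyed here; the function name, projection matrices, and gating scheme are all assumptions for illustration.

```python
import numpy as np

def frequency_self_attention(x, w_q, w_k, w_v):
    """Toy frequency-domain self-attention sketch (illustrative, not a
    specific published method).

    x          : (seq_len, d) real-valued features
    w_q, w_k, w_v : (d, d) projection matrices (hypothetical names)
    """
    # Move the signal to the frequency domain along the sequence axis.
    X = np.fft.rfft(x, axis=0)            # complex, shape (n_freq, d)

    # Use the real-valued magnitude spectrum to form queries/keys/values.
    mag = np.abs(X)
    q, k, v = mag @ w_q, mag @ w_k, mag @ w_v

    # Standard scaled dot-product attention over frequency bins.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)

    # Gate the complex spectrum with the attended magnitudes,
    # then return to the original (time/spatial) domain.
    gate = attn @ v                        # (n_freq, d)
    return np.fft.irfft(X * gate, n=x.shape[0], axis=0)

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))
w_q, w_k, w_v = (rng.standard_normal((8, 8)) * 0.1 for _ in range(3))
y = frequency_self_attention(x, w_q, w_k, w_v)
```

Attending over frequency bins rather than sequence positions is one source of the efficiency gains mentioned above: the number of bins from `rfft` is roughly half the sequence length, so the attention matrix is smaller than in standard positional self-attention.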