Time-Frequency Attention
Time-frequency attention mechanisms enhance signal processing by weighting the importance of different time and frequency components within a signal's representation (typically a spectrogram), improving tasks such as speech enhancement and separation. Current research focuses on integrating these mechanisms into convolutional neural networks (CNNs) and transformers, often using variants of self-attention to capture both local and global relationships in the time-frequency domain. By allowing models to focus selectively on the most relevant regions of the representation, this approach improves performance across applications including speech recognition, video super-resolution, and modulation recognition.
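As a rough illustration of the idea rather than any specific paper's architecture, the sketch below gates a spectrogram-shaped feature map with separate attention weights along the frequency and time axes. The module name, the squeeze-and-excitation-style gating, and the channel/reduction sizes are assumptions chosen for brevity, written in PyTorch.

```python
import torch
import torch.nn as nn


class TimeFrequencyAttention(nn.Module):
    """Illustrative (assumed) module: applies separate attention weights along
    the frequency and time axes of a feature map shaped (batch, channels, freq, time)."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        hidden = max(channels // reduction, 1)
        # Frequency branch: pool over time, then score each (channel, freq) bin.
        self.freq_fc = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, kernel_size=1),
        )
        # Time branch: pool over frequency, then score each (channel, time) frame.
        self.time_fc = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, kernel_size=1),
        )
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, freq, time)
        freq_context = x.mean(dim=3, keepdim=True)   # (B, C, F, 1): average over time
        time_context = x.mean(dim=2, keepdim=True)   # (B, C, 1, T): average over frequency
        freq_weights = self.sigmoid(self.freq_fc(freq_context))  # per-frequency gate in [0, 1]
        time_weights = self.sigmoid(self.time_fc(time_context))  # per-frame gate in [0, 1]
        # Re-weight the input so the network can emphasize informative bins and frames.
        return x * freq_weights * time_weights


if __name__ == "__main__":
    feats = torch.randn(2, 16, 64, 100)  # e.g. 16 channels, 64 mel bins, 100 frames
    attn = TimeFrequencyAttention(channels=16)
    print(attn(feats).shape)  # torch.Size([2, 16, 64, 100])
```

Published variants replace this lightweight gating with full self-attention over time-frequency positions to capture global context; the multiplicative re-weighting of the spectrogram representation is the common thread.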