Paper ID: 2405.13013

Amplifying Aspect-Sentence Awareness: A Novel Approach for Aspect-Based Sentiment Analysis

Adamu Lawan, Juhua Pu, Haruna Yunusa, Jawad Muhammad, Aliyu Umar

Aspect-Based Sentiment Analysis (ABSA) is increasingly crucial in Natural Language Processing (NLP) for applications such as customer feedback analysis and product recommendation systems. ABSA goes beyond traditional sentiment analysis by extracting sentiments tied to specific aspects mentioned in the text. However, existing attention-based models often struggle to connect aspects with their context effectively, owing to language complexity and the presence of multiple sentiment polarities within a single sentence. Recent research underscores the value of integrating syntactic information, such as dependency trees, to better capture long-range syntactic relationships and link aspects with context. Despite these advantages, challenges persist, including sensitivity to parsing errors and increased computational complexity when combining syntactic and semantic information. To address these issues, we propose Amplifying Aspect-Sentence Awareness (A3SN), a novel technique designed to enhance ABSA through amplified aspect-sentence awareness attention. Following the transformer's standard process, our approach uses multi-head attention to enrich the model with sentence and aspect semantic information. We then add a further multi-head attention module, amplified aspect-sentence awareness attention, which doubles its focus between the sentence and the aspect to highlight the aspect's importance within the sentence context, enabling accurate capture of subtle relationships and dependencies. Finally, gated fusion integrates the feature representations produced by the multi-head and amplified aspect-sentence awareness attention mechanisms, which is essential for ABSA. Experimental results on three benchmark datasets demonstrate A3SN's effectiveness, outperforming state-of-the-art (SOTA) baseline models.

Submitted: May 14, 2024
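
The abstract describes three components: standard multi-head attention between the sentence and the aspect, an amplified aspect-sentence awareness attention that doubles the focus between them, and a gated fusion of the two resulting representations. The PyTorch sketch below is only an illustration of that idea under stated assumptions: the class name, dimensions, the use of two stacked cross-attention passes to model the "doubled" focus, and the sigmoid gate are our own choices, not the paper's actual architecture.

# Minimal sketch of the amplified aspect-sentence awareness idea from the
# abstract. Module names, dimensions, and the exact "doubling" scheme are
# assumptions for illustration; the paper's implementation may differ.
import torch
import torch.nn as nn


class AmplifiedAspectSentenceAttention(nn.Module):
    """Cross-attends aspect queries over the sentence twice, then fuses the
    amplified representation with plain multi-head attention via a gate."""

    def __init__(self, d_model: int = 768, num_heads: int = 8):
        super().__init__()
        # Standard aspect-to-sentence multi-head attention.
        self.base_attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        # Two stacked cross-attention passes "amplify" aspect-sentence awareness.
        self.amp_attn_1 = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.amp_attn_2 = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        # Gated fusion of the base and amplified feature representations.
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, sentence: torch.Tensor, aspect: torch.Tensor) -> torch.Tensor:
        # sentence: (batch, sent_len, d_model); aspect: (batch, asp_len, d_model)
        base, _ = self.base_attn(aspect, sentence, sentence)    # aspect attends to sentence
        amp, _ = self.amp_attn_1(aspect, sentence, sentence)    # first awareness pass
        amp, _ = self.amp_attn_2(amp, sentence, sentence)       # second pass amplifies focus
        g = torch.sigmoid(self.gate(torch.cat([base, amp], dim=-1)))  # fusion gate in (0, 1)
        return g * base + (1 - g) * amp                         # gated fusion


if __name__ == "__main__":
    sent = torch.randn(2, 20, 768)   # toy sentence embeddings (e.g., from a BERT encoder)
    asp = torch.randn(2, 3, 768)     # toy aspect-term embeddings
    fused = AmplifiedAspectSentenceAttention()(sent, asp)
    print(fused.shape)               # torch.Size([2, 3, 768])

In this sketch, the gate learns, per position and dimension, how much of the plain attention output versus the amplified output to keep, which is one plausible reading of the gated fusion described in the abstract.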