Unrestricted Attention May Not Be All You Need–Masked Attention Mechanism Focuses Better on Relevant Parts in Aspect-Based Sentiment Analysis


Bibliographic Details
Main Authors: Ao Feng, Xuelei Zhang, Xinyu Song
Format: Article
Language: English
Published: IEEE 2022-01-01
Series: IEEE Access
Subjects: Sentiment analysis, attention mechanism, pre-trained language model, masked attention
Online Access: https://ieeexplore.ieee.org/document/9676694/
_version_ 1819281551388573696
author Ao Feng
Xuelei Zhang
Xinyu Song
author_facet Ao Feng
Xuelei Zhang
Xinyu Song
author_sort Ao Feng
collection DOAJ
description Aspect-Based Sentiment Analysis (ABSA) is one of the most challenging tasks in natural language processing. It extracts fine-grained sentiment information from user-generated reviews, aiming to predict the polarities towards predefined aspect categories or relevant entities in free text. Previous deep learning approaches usually rely on large-scale pre-trained language models and the attention mechanism, which uses the complete set of computed attention weights and places no restriction on how attention is assigned. We argue that the original attention mechanism is not the ideal configuration for ABSA, because most of the time only a small portion of the terms are strongly related to the sentiment polarity of an aspect or entity. In this paper, we propose a masked attention mechanism customized for ABSA, with two different approaches to generating the mask. The first method sets an attention weight threshold determined by the maximum of all weights and keeps only the attention scores above that threshold. The second keeps only the top words with the highest weights. Both remove the lower-scoring parts that are assumed to be less relevant to the aspect in focus. By ignoring the part of the input that is deemed irrelevant, a large proportion of input noise is removed, keeping the downstream model more focused and reducing computation cost. Experiments on the Multi-Aspect Multi-Sentiment (MAMS) and SemEval-2014 datasets show significant improvements over state-of-the-art pre-trained language models with full attention, which demonstrates the value of the masked attention mechanism. Recent work shows that simple self-attention in the Transformer quickly degenerates to a rank-1 matrix, and masked attention may be another cure for that trend.
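The two masking strategies described in the abstract are straightforward to prototype. The following is a minimal sketch, assuming a PyTorch-style implementation; the function name masked_attention_weights and the hyper-parameters alpha and k are illustrative placeholders, not values or names taken from the paper. It masks an attention weight matrix either by a threshold relative to the maximum weight per query or by keeping only the top-scoring keys, then re-normalizes the surviving weights.

import torch
import torch.nn.functional as F

def masked_attention_weights(scores, mode="threshold", alpha=0.5, k=5):
    # scores: raw attention scores of shape (batch, query_len, key_len).
    # "threshold" keeps keys whose weight is at least alpha * (the maximum
    # weight for that query); "topk" keeps the k highest-weighted keys
    # per query. alpha and k are placeholder hyper-parameters.
    weights = F.softmax(scores, dim=-1)

    if mode == "threshold":
        # Threshold derived from the maximum weight along the key axis.
        max_w = weights.max(dim=-1, keepdim=True).values
        keep = weights >= alpha * max_w
    elif mode == "topk":
        # Boolean mask that is True only at the k largest weights per query.
        k = min(k, weights.size(-1))
        idx = weights.topk(k, dim=-1).indices
        keep = torch.zeros_like(weights, dtype=torch.bool)
        keep.scatter_(-1, idx, True)
    else:
        raise ValueError(f"unknown mode: {mode!r}")

    # Zero out the masked positions and re-normalize so the kept weights
    # still sum to one for each query position.
    masked = weights * keep
    return masked / masked.sum(dim=-1, keepdim=True).clamp_min(1e-12)

For example, masked_attention_weights(scores, mode="topk", k=3) would keep only the three highest-weighted tokens per query before the weighted sum over the value vectors. An equivalent formulation is to apply the same mask to the raw scores, setting masked positions to negative infinity before the softmax.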
first_indexed 2024-12-24T01:01:29Z
format Article
id doaj.art-bf764c7bb99c440688e838d341792827
institution Directory Open Access Journal
issn 2169-3536
language English
last_indexed 2024-12-24T01:01:29Z
publishDate 2022-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj.art-bf764c7bb99c440688e838d341792827 2022-12-21T17:23:21Z eng IEEE
IEEE Access, ISSN 2169-3536, 2022-01-01, Vol. 10, pp. 8518-8528
DOI: 10.1109/ACCESS.2022.3142178 (IEEE document 9676694)
Unrestricted Attention May Not Be All You Need–Masked Attention Mechanism Focuses Better on Relevant Parts in Aspect-Based Sentiment Analysis
Ao Feng (https://orcid.org/0000-0001-6231-7810), Xuelei Zhang (https://orcid.org/0000-0001-7927-7181), Xinyu Song (https://orcid.org/0000-0001-9709-029X)
School of Computer Science, Chengdu University of Information Technology, Chengdu, China
https://ieeexplore.ieee.org/document/9676694/
Sentiment analysis; attention mechanism; pre-trained language model; masked attention
spellingShingle Ao Feng
Xuelei Zhang
Xinyu Song
Unrestricted Attention May Not Be All You Need–Masked Attention Mechanism Focuses Better on Relevant Parts in Aspect-Based Sentiment Analysis
IEEE Access
Sentiment analysis
attention mechanism
pre-trained language model
masked attention
title Unrestricted Attention May Not Be All You Need–Masked Attention Mechanism Focuses Better on Relevant Parts in Aspect-Based Sentiment Analysis
title_full Unrestricted Attention May Not Be All You Need–Masked Attention Mechanism Focuses Better on Relevant Parts in Aspect-Based Sentiment Analysis
title_fullStr Unrestricted Attention May Not Be All You Need–Masked Attention Mechanism Focuses Better on Relevant Parts in Aspect-Based Sentiment Analysis
title_full_unstemmed Unrestricted Attention May Not Be All You Need–Masked Attention Mechanism Focuses Better on Relevant Parts in Aspect-Based Sentiment Analysis
title_short Unrestricted Attention May Not Be All You Need–Masked Attention Mechanism Focuses Better on Relevant Parts in Aspect-Based Sentiment Analysis
title_sort unrestricted attention may not be all you need masked attention mechanism focuses better on relevant parts in aspect based sentiment analysis
topic Sentiment analysis
attention mechanism
pre-trained language model
masked attention
url https://ieeexplore.ieee.org/document/9676694/
work_keys_str_mv AT aofeng unrestrictedattentionmaynotbeallyouneedmaskedattentionmechanismfocusesbetteronrelevantpartsinaspectbasedsentimentanalysis
AT xueleizhang unrestrictedattentionmaynotbeallyouneedmaskedattentionmechanismfocusesbetteronrelevantpartsinaspectbasedsentimentanalysis
AT xinyusong unrestrictedattentionmaynotbeallyouneedmaskedattentionmechanismfocusesbetteronrelevantpartsinaspectbasedsentimentanalysis