Target-Dependent Sentiment Classification With BERT
Research on machine-assisted text analysis follows the rapid development of digital media, and sentiment analysis is among its most prevalent applications. Traditional sentiment analysis methods require complex feature engineering, and embedding representations have dominated leaderboards for a long time...
Main Authors: | Zhengjie Gao, Ao Feng, Xinyu Song, Xi Wu |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2019-01-01 |
Series: | IEEE Access |
Subjects: | Deep learning; neural networks; sentiment analysis; BERT |
Online Access: | https://ieeexplore.ieee.org/document/8864964/ |
_version_ | 1818330141537861632 |
---|---|
author | Zhengjie Gao; Ao Feng; Xinyu Song; Xi Wu |
author_facet | Zhengjie Gao; Ao Feng; Xinyu Song; Xi Wu |
author_sort | Zhengjie Gao |
collection | DOAJ |
description | Research on machine-assisted text analysis follows the rapid development of digital media, and sentiment analysis is among its most prevalent applications. Traditional sentiment analysis methods require complex feature engineering, and embedding representations have dominated leaderboards for a long time. However, their context-independent nature limits their representational power in rich contexts, hurting performance on Natural Language Processing (NLP) tasks. Bidirectional Encoder Representations from Transformers (BERT), among other pre-trained language models, beats the previous best results in eleven NLP tasks (including sentence-level sentiment classification) by a large margin, which makes it the new baseline for text representation. For the more challenging task of sentiment classification at the aspect level, fewer applications of BERT have been observed. We implement three target-dependent variations of the BERT<sub>base</sub> model, with positioned output at the target terms and an optional sentence with the target built in. Experiments on three data collections show that our TD-BERT model achieves new state-of-the-art performance, in comparison to traditional feature-engineering methods, embedding-based models, and earlier applications of BERT. With the successful application of BERT to many NLP tasks, our experiments verify whether its context-aware representation can achieve a similar performance improvement in aspect-based sentiment analysis. Surprisingly, coupling it with the complex neural networks that used to work well with embedding representations adds little value, sometimes performing below the vanilla BERT-FC implementation. On the other hand, incorporating target information yields a stable accuracy improvement, and the most effective way of utilizing that information is demonstrated through our experiments. |
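The core TD-BERT idea in the description above — reading the encoder's output vectors at the target-term positions, pooling them, and feeding the result to a fully connected classifier — can be sketched in plain Python. This is an illustrative reconstruction, not the authors' code: the hidden vectors, dimensions, classifier weights, and the choice of max-pooling are made-up stand-ins for real pretrained-BERT outputs and trained parameters.

```python
# Illustrative sketch of target-position pooling over encoder outputs.
# "hidden" stands in for BERT's final-layer token vectors (one per token).

def td_pool(hidden, target_positions):
    """Element-wise max-pool the hidden vectors at the target-term positions."""
    selected = [hidden[i] for i in target_positions]
    return [max(col) for col in zip(*selected)]

def classify(pooled, weights, bias):
    """Toy fully connected layer; returns the index of the best-scoring class."""
    scores = [sum(w * x for w, x in zip(row, pooled)) + b
              for row, b in zip(weights, bias)]
    return scores.index(max(scores))

# Example: 5 tokens, hidden size 4, target term spanning tokens 2-3.
hidden = [
    [0.1, 0.2, 0.0, 0.5],
    [0.3, 0.1, 0.4, 0.2],
    [0.9, 0.0, 0.7, 0.1],  # target token
    [0.2, 0.8, 0.3, 0.6],  # target token
    [0.0, 0.1, 0.2, 0.3],
]
pooled = td_pool(hidden, [2, 3])  # column-wise max over the two target rows
weights = [[1, 0, 0, 0],          # 3 sentiment classes (toy weights)
           [0, 1, 0, 0],
           [0, 0, 1, 0]]
bias = [0.0, 0.0, 0.0]
label = classify(pooled, weights, bias)
```

In a real implementation, `hidden` would come from a pretrained BERT encoder and the classifier weights would be learned during fine-tuning; only the positions of the target term steer which token vectors reach the classifier.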
first_indexed | 2024-12-13T12:59:14Z |
format | Article |
id | doaj.art-63521e06357543a290717dd56689f5c4 |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
last_indexed | 2024-12-13T12:59:14Z |
publishDate | 2019-01-01 |
publisher | IEEE |
record_format | Article |
series | IEEE Access |
spelling | doaj.art-63521e06357543a290717dd56689f5c4; 2022-12-21T23:45:04Z; eng; IEEE; IEEE Access; 2169-3536; 2019-01-01; vol. 7, pp. 154290-154299; 10.1109/ACCESS.2019.2946594; article 8864964; Target-Dependent Sentiment Classification With BERT; Zhengjie Gao (https://orcid.org/0000-0003-0686-4611), Ao Feng (https://orcid.org/0000-0001-6231-7810), Xinyu Song, Xi Wu (https://orcid.org/0000-0002-7659-1631), all: Department of Computer Science, Chengdu University of Information Technology, Chengdu, China; https://ieeexplore.ieee.org/document/8864964/; Deep learning; neural networks; sentiment analysis; BERT |
spellingShingle | Zhengjie Gao; Ao Feng; Xinyu Song; Xi Wu; Target-Dependent Sentiment Classification With BERT; IEEE Access; Deep learning; neural networks; sentiment analysis; BERT |
title | Target-Dependent Sentiment Classification With BERT |
title_full | Target-Dependent Sentiment Classification With BERT |
title_fullStr | Target-Dependent Sentiment Classification With BERT |
title_full_unstemmed | Target-Dependent Sentiment Classification With BERT |
title_short | Target-Dependent Sentiment Classification With BERT |
title_sort | target dependent sentiment classification with bert |
topic | Deep learning; neural networks; sentiment analysis; BERT |
url | https://ieeexplore.ieee.org/document/8864964/ |
work_keys_str_mv | AT zhengjiegao targetdependentsentimentclassificationwithbert AT aofeng targetdependentsentimentclassificationwithbert AT xinyusong targetdependentsentimentclassificationwithbert AT xiwu targetdependentsentimentclassificationwithbert |