Sentiment Analysis Using Pre-Trained Language Model With No Fine-Tuning and Less Resource

Sentiment analysis became popular once Natural Language Processing algorithms proved able to process complex sentences with good accuracy. Recently, pre-trained language models such as BERT and mBERT have been shown to be effective at improving language tasks. Most work implementing these models focuses on fine-tuning BERT to achieve the desired results. However, this approach is resource-intensive and requires long training times, up to a few hours on a GPU depending on the dataset. Hence, this paper proposes a less complex system with shorter training time that uses the BERT model without fine-tuning and adopts a feature reduction algorithm to shrink the sentence embeddings. The experimental results show that with 50% fewer sentence-embedding features, the proposed system improves accuracy by 1-2% with 71% less training time and 89% less memory usage. The proposed approach is also shown to work for multilingual tasks using a single mBERT model.
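As context for the pipeline the abstract describes (frozen pre-trained encoder, feature reduction, lightweight classifier), here is a minimal sketch. It assumes the HuggingFace Transformers bert-base-uncased checkpoint as the frozen encoder, mean pooling for sentence embeddings, PCA as a stand-in for the feature reduction algorithm (the abstract does not name the algorithm), and logistic regression as the classifier; all of these choices and the toy data are illustrative assumptions, not the authors' exact configuration.

    import torch
    from transformers import AutoTokenizer, AutoModel
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.eval()  # frozen feature extractor: no fine-tuning of BERT weights

    def embed(sentences):
        # Mean-pool the last hidden states into one 768-dim vector per sentence.
        with torch.no_grad():
            batch = tokenizer(sentences, padding=True, truncation=True,
                              return_tensors="pt")
            hidden = model(**batch).last_hidden_state             # (B, T, 768)
            mask = batch["attention_mask"].unsqueeze(-1).float()  # (B, T, 1)
            return ((hidden * mask).sum(1) / mask.sum(1)).numpy()

    # Toy data for illustration; the paper evaluates on real sentiment datasets.
    train_texts = ["great movie", "terrible plot", "loved it", "waste of time"]
    train_labels = [1, 0, 1, 0]

    X = embed(train_texts)

    # Feature reduction: on a real dataset n_components would be 384 (half of
    # 768, mirroring the abstract's 50% figure); PCA caps components at the
    # number of samples, so this toy example uses min() to stay runnable.
    reducer = PCA(n_components=min(X.shape[1] // 2, X.shape[0])).fit(X)
    clf = LogisticRegression(max_iter=1000).fit(reducer.transform(X), train_labels)

    print(clf.predict(reducer.transform(embed(["what a fantastic film"]))))

Since BERT stays frozen, only the reducer and the small classifier are trained, which is where the abstract's reported savings in training time and memory would come from; swapping the checkpoint for bert-base-multilingual-cased would mirror the single-mBERT multilingual setup.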


Bibliographic Details
Main Authors: Yuheng Kit, Musa Mohd Mokji
Format: Article
Language: English
Published: IEEE, 2022-01-01
Series: IEEE Access
Subjects: Sentiment analysis; natural language processing
Online Access: https://ieeexplore.ieee.org/document/9912410/
Citation: Yuheng Kit and Musa Mohd Mokji, "Sentiment Analysis Using Pre-Trained Language Model With No Fine-Tuning and Less Resource," IEEE Access, vol. 10, pp. 107056-107065, 2022, doi: 10.1109/ACCESS.2022.3212367
ISSN: 2169-3536
Author ORCIDs: Yuheng Kit, https://orcid.org/0000-0002-3114-4408; Musa Mohd Mokji, https://orcid.org/0000-0002-7590-8481
Affiliation (both authors): Faculty of Engineering, School of Electrical Engineering, Universiti Teknologi Malaysia, Johor Bahru, Malaysia
DOAJ record: doaj.art-07c91a3ad7d04f86b61512071538bcad