Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model

Sentiment analysis is currently one of the fastest-growing areas of research due to the large amount of web content generated by social networking websites. Sentiment analysis is a crucial process for the recommender systems that most people rely on. Generally, the purpose of sentiment analysis is to determine an author’s...

Full description

Bibliographic Details
Main Authors: Ali Areshey, Hassan Mathkour
Format: Article
Language: English
Published: MDPI AG 2023-05-01
Series: Sensors
Subjects: BERT model; sentiment analysis; machine learning; transformers; transfer learning
Online Access: https://www.mdpi.com/1424-8220/23/11/5232
author Ali Areshey
Hassan Mathkour
collection DOAJ
description Sentiment analysis is currently one of the fastest-growing areas of research due to the large amount of web content generated by social networking websites. Sentiment analysis is a crucial process for the recommender systems that most people rely on. Generally, the purpose of sentiment analysis is to determine an author’s attitude toward a subject or the overall tone of a document. A large body of studies has attempted to predict how useful online reviews will be, producing conflicting results on the efficacy of different methodologies. Furthermore, many current solutions employ manual feature generation and conventional shallow learning methods, which restricts generalization. The goal of this research is therefore to develop a general approach using transfer learning by applying a BERT (Bidirectional Encoder Representations from Transformers)-based model. The efficiency of BERT classification is then evaluated by comparing it with similar machine learning techniques. In the experimental evaluation, the proposed model demonstrated superior prediction and higher accuracy compared to earlier research. Comparative tests conducted on positive and negative Yelp reviews reveal that fine-tuned BERT classification outperforms the other approaches. In addition, it is observed that the batch size and sequence length used with BERT classifiers significantly affect classification performance.
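The description above refers to fine-tuning a pretrained BERT model for binary (positive/negative) sentiment classification of Yelp reviews and notes that batch size and sequence length affect the results. The following is a minimal illustrative sketch of such a workflow, assuming the Hugging Face Transformers and Datasets libraries and the public yelp_polarity dataset; it is not the authors' implementation, and the hyperparameter values are placeholders.

# Minimal sketch: transfer learning with BERT for sentiment classification.
# Assumptions: Hugging Face Transformers/Datasets, the public "yelp_polarity"
# dataset, and illustrative hyperparameters (not the authors' settings).
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)

MODEL_NAME = "bert-base-uncased"   # pretrained checkpoint to transfer from
MAX_LENGTH = 128                   # sequence length (one of the hyperparameters studied)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

dataset = load_dataset("yelp_polarity")  # positive/negative Yelp reviews

def tokenize(batch):
    # Truncate or pad every review to the chosen maximum sequence length.
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=MAX_LENGTH)

encoded = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bert-yelp-sentiment",
    per_device_train_batch_size=16,  # batch size (the other hyperparameter studied)
    num_train_epochs=2,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["test"],
)

trainer.train()            # fine-tune the BERT weights on the labeled reviews
print(trainer.evaluate())  # report evaluation loss on the held-out split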
first_indexed 2024-03-11T02:57:09Z
format Article
id doaj.art-e34e4f55b405428c88f0a046bd04d041
institution Directory Open Access Journal
issn 1424-8220
language English
last_indexed 2024-03-11T02:57:09Z
publishDate 2023-05-01
publisher MDPI AG
record_format Article
series Sensors
doi 10.3390/s23115232
citation Sensors, vol. 23, no. 11, article 5232 (2023-05-01)
affiliation Ali Areshey: Department of Computer Science, College of Computer and Information Sciences, King Saud University, Riyadh 11543, Saudi Arabia
affiliation Hassan Mathkour: Department of Computer Science, College of Computer and Information Sciences, King Saud University, Riyadh 11543, Saudi Arabia
title Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model
topic BERT model
sentiment analysis
machine learning
transformers
transfer learning
url https://www.mdpi.com/1424-8220/23/11/5232