Attention-Based CNN and Bi-LSTM Model Based on TF-IDF and GloVe Word Embedding for Sentiment Analysis


Bibliographic Details
Main Authors: Marjan Kamyab, Guohua Liu, Michael Adjeisah
Format: Article
Language: English
Published: MDPI AG, 2021-11-01
Series: Applied Sciences
Subjects: deep learning; CNN; Bi-LSTM; attention mechanism; social media sentiment analysis; TF-IDF
Online Access: https://www.mdpi.com/2076-3417/11/23/11255
author Marjan Kamyab
Guohua Liu
Michael Adjeisah
author_facet Marjan Kamyab
Guohua Liu
Michael Adjeisah
author_sort Marjan Kamyab
collection DOAJ
description Sentiment analysis (SA) detects people’s opinions from text using natural language processing (NLP) techniques. Recent research has shown that deep learning models, i.e., the Convolutional Neural Network (CNN), the Recurrent Neural Network (RNN), and Transformer-based models, provide promising results for recognizing sentiment. Nonetheless, although the CNN can extract high-level features with its convolutional and max-pooling layers, it cannot efficiently learn sequential correlations. A bidirectional RNN, in turn, processes the sequence in two directions to better capture long-term dependencies, but it cannot extract local features in parallel, and Transformer-based models such as Bidirectional Encoder Representations from Transformers (BERT) require substantial computational resources to fine-tune and tend to overfit on small datasets. This paper proposes a novel attention-based model that combines CNNs with an LSTM (named ACL-SA). First, it applies a preprocessor to enhance data quality and employs term frequency-inverse document frequency (TF-IDF) feature weighting and pre-trained GloVe word embeddings to extract meaningful information from textual data. In addition, it utilizes the CNN’s max-pooling to extract contextual features and reduce feature dimensionality. Moreover, it uses an integrated bidirectional LSTM to capture long-term dependencies. Furthermore, it applies an attention mechanism to the CNN’s output to emphasize each word’s contribution. To avoid overfitting, Gaussian noise and Gaussian dropout are adopted as regularization. The model’s robustness is evaluated on four standard English datasets, i.e., Sentiment140, US-airline, Sentiment140-MV, and SA4A, with various performance metrics, and its efficiency is compared with existing baseline models and approaches. The experimental results show that the proposed method significantly outperforms state-of-the-art models.
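The description above outlines the ACL-SA pipeline: pre-trained GloVe embeddings, a CNN with max-pooling, a bidirectional LSTM, an attention mechanism, and Gaussian noise/dropout regularization. The following is a minimal, illustrative Keras sketch of that kind of architecture, not the authors' implementation: the layer sizes, kernel width, noise and dropout rates, the additive attention formulation, and placing attention after the Bi-LSTM are all assumptions, and the TF-IDF weighting step is omitted for brevity.

```python
# Illustrative sketch of an ACL-SA-style model (assumed hyperparameters, not the paper's).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE = 20000   # assumed vocabulary size
MAX_LEN = 50         # assumed maximum sequence length (tokens)
EMBED_DIM = 100      # GloVe dimensionality (e.g., glove.6B.100d)

def build_acl_sa(embedding_matrix):
    """Embeddings -> Gaussian noise -> CNN + max-pooling -> Bi-LSTM -> attention -> classifier."""
    inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")
    # Embedding layer initialised from pre-trained GloVe vectors.
    x = layers.Embedding(VOCAB_SIZE, EMBED_DIM,
                         weights=[embedding_matrix], trainable=False)(inputs)
    x = layers.GaussianNoise(0.2)(x)                                   # regularise the embeddings
    x = layers.Conv1D(128, kernel_size=3, padding="same",
                      activation="relu")(x)                            # local n-gram features
    x = layers.MaxPooling1D(pool_size=2)(x)                            # reduce feature dimensionality
    x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)  # long-term dependencies
    # Simple additive attention over time steps (one common formulation).
    scores = layers.Dense(1, activation="tanh")(x)                     # (batch, steps, 1)
    weights = layers.Softmax(axis=1)(scores)                           # normalise over time
    context = layers.Lambda(lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([x, weights])
    context = layers.GaussianDropout(0.3)(context)                     # regularise before output
    outputs = layers.Dense(1, activation="sigmoid")(context)           # binary sentiment
    return Model(inputs, outputs)

# Usage with a random matrix standing in for real GloVe vectors:
model = build_acl_sa(np.random.normal(size=(VOCAB_SIZE, EMBED_DIM)).astype("float32"))
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

In practice, `embedding_matrix` would be built by mapping the tokenizer's vocabulary onto the pre-trained GloVe vectors, and a multi-class sentiment task would replace the sigmoid output with a softmax over the classes.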
first_indexed 2024-03-10T04:57:43Z
format Article
id doaj.art-e89d0b33a41241c795734418e06656bf
institution Directory Open Access Journal
issn 2076-3417
language English
last_indexed 2024-03-10T04:57:43Z
publishDate 2021-11-01
publisher MDPI AG
record_format Article
series Applied Sciences
spelling Attention-Based CNN and Bi-LSTM Model Based on TF-IDF and GloVe Word Embedding for Sentiment Analysis. Marjan Kamyab, Guohua Liu (School of Computer Science and Technology, Donghua University, Shanghai 201620, China); Michael Adjeisah (College of Mathematics and Computer Science, Zhejiang Normal University, Jinhua 321004, China). Applied Sciences, MDPI AG, 2021-11-01, vol. 11, no. 23, article 11255. ISSN 2076-3417. DOI: 10.3390/app112311255. https://www.mdpi.com/2076-3417/11/23/11255. Keywords: deep learning; CNN; Bi-LSTM; attention mechanism; social media sentiment analysis; TF-IDF.
title Attention-Based CNN and Bi-LSTM Model Based on TF-IDF and GloVe Word Embedding for Sentiment Analysis
topic deep learning
CNN
Bi-LSTM
attention mechanism
social media sentiment analysis
TF-IDF
url https://www.mdpi.com/2076-3417/11/23/11255