A Sequential Graph Neural Network for Short Text Classification
Short text classification is an important problem of natural language processing (NLP), and graph neural networks (GNNs) have been successfully used to solve different NLP problems. However, few studies employ GNN for short text classification, and most of the existing graph-based models ignore sequential information (e.g., word orders) in each document. In this work, we propose an improved sequence-based feature propagation scheme, which fully uses word representation and document-level word interaction and overcomes the limitations of textual features in short texts. On this basis, we utilize this propagation scheme to construct a lightweight model, sequential GNN (SGNN), and its extended model, ESGNN. Specifically, we build individual graphs for each document in the short text corpus based on word co-occurrence and use a bidirectional long short-term memory network (Bi-LSTM) to extract the sequential features of each document; therefore, word nodes in the document graph retain contextual information. Furthermore, two different simplified graph convolutional networks (GCNs) are used to learn word representations based on their local structures. Finally, word nodes combined with sequential information and local information are incorporated as the document representation. Extensive experiments on seven benchmark datasets demonstrate the effectiveness of our method.
Main Authors: | Ke Zhao, Lan Huang, Rui Song, Qiang Shen, Hao Xu |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-12-01 |
Series: | Algorithms |
Subjects: | graph neural networks; short text classification; sequential features |
Online Access: | https://www.mdpi.com/1999-4893/14/12/352 |
_version_ | 1797507031819616256 |
---|---|
author | Ke Zhao; Lan Huang; Rui Song; Qiang Shen; Hao Xu |
author_facet | Ke Zhao; Lan Huang; Rui Song; Qiang Shen; Hao Xu |
author_sort | Ke Zhao |
collection | DOAJ |
description | Short text classification is an important problem of natural language processing (NLP), and graph neural networks (GNNs) have been successfully used to solve different NLP problems. However, few studies employ GNN for short text classification, and most of the existing graph-based models ignore sequential information (e.g., word orders) in each document. In this work, we propose an improved sequence-based feature propagation scheme, which fully uses word representation and document-level word interaction and overcomes the limitations of textual features in short texts. On this basis, we utilize this propagation scheme to construct a lightweight model, sequential GNN (SGNN), and its extended model, ESGNN. Specifically, we build individual graphs for each document in the short text corpus based on word co-occurrence and use a bidirectional long short-term memory network (Bi-LSTM) to extract the sequential features of each document; therefore, word nodes in the document graph retain contextual information. Furthermore, two different simplified graph convolutional networks (GCNs) are used to learn word representations based on their local structures. Finally, word nodes combined with sequential information and local information are incorporated as the document representation. Extensive experiments on seven benchmark datasets demonstrate the effectiveness of our method. |
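The description above outlines the propagation scheme: a per-document word co-occurrence graph is built, and a simplified GCN propagates node features over that graph before pooling into a document representation. Below is a minimal numpy sketch of those two steps. The sliding-window size, the identity feature initialization, and the mean pooling are illustrative assumptions, and the Bi-LSTM stage that would supply contextual node features is omitted; this is not the authors' exact implementation.

```python
import numpy as np

def build_cooccurrence_graph(tokens, window=3):
    """Build a symmetric word-level adjacency matrix for one document
    from sliding-window co-occurrence (window size is an assumption)."""
    vocab = sorted(set(tokens))
    idx = {w: i for i, w in enumerate(vocab)}
    A = np.zeros((len(vocab), len(vocab)))
    for i, w in enumerate(tokens):
        # link each token to the next (window - 1) tokens
        for j in range(i + 1, min(i + window, len(tokens))):
            u, v = idx[w], idx[tokens[j]]
            A[u, v] = A[v, u] = 1.0
    return vocab, A

def simplified_gcn(A, X, k=2):
    """SGC-style feature propagation: X' = (D^-1/2 (A + I) D^-1/2)^k X,
    i.e. k rounds of normalized neighborhood averaging with no
    nonlinearity between layers."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    S = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    for _ in range(k):
        X = S @ X
    return X

# Usage: propagate (here one-hot) node features, then pool into a
# document vector; in the paper's setting X would come from the Bi-LSTM.
tokens = "graph neural networks classify short text".split()
vocab, A = build_cooccurrence_graph(tokens)
H = simplified_gcn(A, np.eye(len(vocab)), k=2)
doc_vec = H.mean(axis=0)
```

Dropping the per-layer nonlinearity, as in SGC-style models, keeps the propagation a single precomputable matrix power, which is what makes the resulting model lightweight.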
first_indexed | 2024-03-10T04:40:55Z |
format | Article |
id | doaj.art-c4cd70c113424994b522bd89e59c6e5b |
institution | Directory Open Access Journal |
issn | 1999-4893 |
language | English |
last_indexed | 2024-03-10T04:40:55Z |
publishDate | 2021-12-01 |
publisher | MDPI AG |
record_format | Article |
series | Algorithms |
spelling | doaj.art-c4cd70c113424994b522bd89e59c6e5b | 2023-11-23T03:24:51Z | eng | MDPI AG | Algorithms | 1999-4893 | 2021-12-01 | Vol. 14, Iss. 12, Art. 352 | 10.3390/a14120352 | A Sequential Graph Neural Network for Short Text Classification | Ke Zhao (College of Software, Jilin University, Changchun 130012, China); Lan Huang (College of Computer Science and Technology, Jilin University, Changchun 130012, China); Rui Song (School of Artificial Intelligence, Jilin University, Changchun 130012, China); Qiang Shen (College of Computer Science and Technology, Jilin University, Changchun 130012, China); Hao Xu (College of Computer Science and Technology, Jilin University, Changchun 130012, China) | https://www.mdpi.com/1999-4893/14/12/352 | graph neural networks; short text classification; sequential features |
spellingShingle | Ke Zhao; Lan Huang; Rui Song; Qiang Shen; Hao Xu; A Sequential Graph Neural Network for Short Text Classification; Algorithms; graph neural networks; short text classification; sequential features |
title | A Sequential Graph Neural Network for Short Text Classification |
title_full | A Sequential Graph Neural Network for Short Text Classification |
title_fullStr | A Sequential Graph Neural Network for Short Text Classification |
title_full_unstemmed | A Sequential Graph Neural Network for Short Text Classification |
title_short | A Sequential Graph Neural Network for Short Text Classification |
title_sort | sequential graph neural network for short text classification |
topic | graph neural networks; short text classification; sequential features |
url | https://www.mdpi.com/1999-4893/14/12/352 |
work_keys_str_mv | AT kezhao asequentialgraphneuralnetworkforshorttextclassification AT lanhuang asequentialgraphneuralnetworkforshorttextclassification AT ruisong asequentialgraphneuralnetworkforshorttextclassification AT qiangshen asequentialgraphneuralnetworkforshorttextclassification AT haoxu asequentialgraphneuralnetworkforshorttextclassification AT kezhao sequentialgraphneuralnetworkforshorttextclassification AT lanhuang sequentialgraphneuralnetworkforshorttextclassification AT ruisong sequentialgraphneuralnetworkforshorttextclassification AT qiangshen sequentialgraphneuralnetworkforshorttextclassification AT haoxu sequentialgraphneuralnetworkforshorttextclassification |