Text Summarization Method Based on Gated Attention Graph Neural Network

Text summarization is an information compression technique that extracts the important content of a long text, and it has become a challenging research direction in natural language processing. Deep-learning-based summarization models have shown good results, but how to model the relationships between words more effectively, extract feature information more accurately, and eliminate redundant information remains an open problem. This paper proposes GA-GNN, a graph neural network model based on gated attention, which effectively improves the accuracy and readability of text summaries. First, words are encoded with a concatenated sentence encoder to generate deeper vectors that contain both local and global semantic information (see the sketch below). Second, gated attention units eliminate locally irrelevant information, improving the extraction of key features.
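
The paper's code is not part of this record, so the following is only a minimal sketch of one plausible reading of the "concatenated sentence encoder": a convolutional branch capturing local (n-gram) features, concatenated with a BiLSTM branch capturing global context. Every name here (ConcatSentenceEncoder, emb_dim, hidden) is hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn

class ConcatSentenceEncoder(nn.Module):
    """Hypothetical 'concatenated' encoder: a Conv1d branch for local
    (n-gram) features and a BiLSTM branch for global context, whose
    outputs are concatenated into one deeper word representation."""

    def __init__(self, vocab_size: int, emb_dim: int = 128, hidden: int = 128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, hidden, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(emb_dim, hidden // 2, bidirectional=True, batch_first=True)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) word ids
        e = self.emb(tokens)                                              # (batch, seq, emb_dim)
        local = torch.relu(self.conv(e.transpose(1, 2))).transpose(1, 2)  # local n-gram features
        global_, _ = self.lstm(e)                                         # global contextual features
        return torch.cat([local, global_], dim=-1)                        # (batch, seq, 2 * hidden)

# usage: encode a batch of 2 sentences, 30 tokens each
enc = ConcatSentenceEncoder(vocab_size=30000)
h = enc(torch.randint(0, 30000, (2, 30)))  # -> (2, 30, 256)
```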

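The gated attention unit itself is also not specified in this record. As a hedged illustration only, the sketch below assumes one common formulation: scaled dot-product self-attention whose output is mixed with the input through an elementwise sigmoid gate, so that locally irrelevant features can be suppressed. The class name GatedAttentionUnit and all hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn

class GatedAttentionUnit(nn.Module):
    """Hypothetical gated attention unit: self-attention output filtered
    by an elementwise sigmoid gate conditioned on both the input and the
    attention output, suppressing locally irrelevant features."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) word/node representations
        attn_out, _ = self.attn(x, x, x)
        g = torch.sigmoid(self.gate(torch.cat([x, attn_out], dim=-1)))
        return g * attn_out + (1 - g) * x  # gated residual mix

# usage: representations of dim 256
unit = GatedAttentionUnit(dim=256)
h = unit(torch.randn(2, 30, 256))  # -> (2, 30, 256)
```
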
Finally, the loss function is optimized in three respects, namely contrastive learning, confidence calculation for important sentences, and graph feature extraction, to improve the robustness of the model. Experiments on the CNN/Daily Mail and MR datasets show that the proposed model outperforms existing methods.
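
The abstract names contrastive learning as one component of the loss but gives no formula. A minimal sketch under a common assumption, an InfoNCE-style objective in which each document representation is trained to be most similar to its own summary representation within a batch, could look like this; contrastive_loss and the temperature value are illustrative only, not the paper's definition.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(doc_repr: torch.Tensor, summ_repr: torch.Tensor,
                     temperature: float = 0.1) -> torch.Tensor:
    """Hypothetical InfoNCE-style term: each document representation should
    be most similar to its own summary representation within the batch.
    doc_repr, summ_repr: (batch, dim)."""
    doc = F.normalize(doc_repr, dim=-1)
    summ = F.normalize(summ_repr, dim=-1)
    logits = doc @ summ.t() / temperature            # (batch, batch) cosine similarities
    targets = torch.arange(doc.size(0), device=doc.device)
    return F.cross_entropy(logits, targets)          # matched pairs lie on the diagonal
```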

Bibliographic Details
Main Authors: Jingui Huang, Wenya Wu, Jingyi Li, Shengchun Wang (College of Information Science and Engineering, Hunan Normal University, Changsha 410081, China)
Format: Article
Language: English
Published: MDPI AG, 2023-02-01
Series: Sensors, Vol. 23, No. 3, Article 1654
ISSN: 1424-8220
DOI: 10.3390/s23031654
Subjects: encoder-decoder; GNN; contrastive learning; confidence calculation of important sentences; attention mechanism
Collection: DOAJ
Online Access: https://www.mdpi.com/1424-8220/23/3/1654