An Abstractive Summarization Model Based on Joint-Attention Mechanism and a Priori Knowledge
An abstractive summarization model based on the joint-attention mechanism and a priori knowledge is proposed to address two problems in abstractive summarization models: inadequate semantic understanding of the source text, and generated summaries that do not conform to human language habits. First, the word vectors most relevant to the original text are selected. Second, the original text is represented at two levels, word level and sentence level, as word vectors and sentence vectors, respectively. After this processing, relationships exist not only among the word-level vectors but also among the sentence-level vectors, and the decoder weighs word-level against sentence-level vectors according to their relationship with its hidden state. The pointer-generator network is then improved using a priori knowledge. Finally, reinforcement learning is used to improve the quality of the generated summaries. Experiments on two classical datasets, CNN/DailyMail and DUC 2004, show that the model performs well and effectively improves the quality of the generated summaries.
Main Authors: Yuanyuan Li, Yuan Huang, Weijian Huang, Junhao Yu, Zheng Huang
Author Affiliation: School of Information and Electrical Engineering, Hebei University of Engineering, Handan 056038, China
Format: Article
Language: English
Published: MDPI AG, 2023-04-01
Series: Applied Sciences
ISSN: 2076-3417
DOI: 10.3390/app13074610
Subjects: abstractive summarization; joint-attention mechanism; prior knowledge; reinforcement learning
Collection: Directory of Open Access Journals (DOAJ)
Online Access: https://www.mdpi.com/2076-3417/13/7/4610
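The abstract above describes a decoder that attends jointly over word-level and sentence-level representations of the source text and weighs the two levels against its hidden state. As a rough, illustrative sketch only, the joint-attention step might look like the following; the dot-product scoring, the scalar gate, and all shapes and names are assumptions for illustration, not the authors' published implementation.

```python
# Minimal sketch of a joint-attention step over word-level and sentence-level
# encodings, gated by agreement with the decoder hidden state.
# All design details here are assumptions, not the paper's implementation.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def joint_attention(dec_hidden, word_vecs, sent_vecs):
    """dec_hidden: (d,); word_vecs: (n_words, d); sent_vecs: (n_sents, d)."""
    # Dot-product attention over each representation level separately.
    word_attn = softmax(word_vecs @ dec_hidden)   # (n_words,)
    sent_attn = softmax(sent_vecs @ dec_hidden)   # (n_sents,)
    word_ctx = word_attn @ word_vecs              # (d,) word-level context
    sent_ctx = sent_attn @ sent_vecs              # (d,) sentence-level context
    # Hypothetical scalar gate: how much each level contributes, based on how
    # strongly each context vector aligns with the decoder state.
    gate = softmax(np.array([word_ctx @ dec_hidden, sent_ctx @ dec_hidden]))
    return gate[0] * word_ctx + gate[1] * sent_ctx  # joint context vector

d = 8
rng = np.random.default_rng(0)
context = joint_attention(rng.normal(size=d),
                          rng.normal(size=(20, d)),   # 20 word vectors
                          rng.normal(size=(4, d)))    # 4 sentence vectors
print(context.shape)  # (8,)
```

In the paper's framing, such a joint context vector would then feed the pointer-generator (improved with a priori knowledge) and the reinforcement-learning objective; those components are not sketched here.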