A Survey of the State-of-the-Art Models in Neural Abstractive Text Summarization


Bibliographic Details
Main Authors: Ayesha Ayub Syed, Ford Lumban Gaol, Tokuro Matsuo
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access
Subjects: Abstractive text summarization; encoder; decoder; training; optimization; evaluation
Online Access: https://ieeexplore.ieee.org/document/9328413/
Description: Dealing with vast amounts of textual data requires efficient systems, and automatic summarization systems can address this need. It is therefore essential to study the design of existing automatic summarization systems and to innovate on them so that they can meet the demands of continuously growing data, based on user needs. This study surveys the scientific literature on recent research in automatic text summarization, specifically abstractive summarization based on neural networks. A review of various neural-network-based abstractive summarization models is presented. The proposed conceptual framework comprises five key elements: encoder-decoder architecture, mechanisms, training strategies and optimization algorithms, dataset, and evaluation metric. A description of these elements is also included in this article. The purpose of this research is to provide an overall understanding of, and familiarity with, the elements of recent neural-network-based abstractive text summarization models through an up-to-date review, and to raise awareness of the challenges and issues with these systems. The analysis was performed qualitatively with the help of a concept matrix indicating common trends in the design of recent neural abstractive summarization systems. Models employing a transformer-based encoder-decoder architecture are found to be the new state of the art. Based on the knowledge acquired from the survey, this article suggests using pre-trained language models in complement with neural network architectures for the abstractive summarization task.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3052783
Volume: 9
Pages: 13248-13265
Author Affiliations:
Ayesha Ayub Syed (https://orcid.org/0000-0002-3113-8980): Department of Doctor of Computer Science, Bina Nusantara University, Jakarta, Indonesia
Ford Lumban Gaol (https://orcid.org/0000-0002-5116-5708): Department of Doctor of Computer Science, Bina Nusantara University, Jakarta, Indonesia
Tokuro Matsuo: Graduate School of Industrial Technology, Advanced Institute of Industrial Technology, Tokyo, Japan