Investigating the Pre-Training Bias in Low-Resource Abstractive Summarization
Recent advances in low-resource abstractive summarization were largely made through the adoption of specialized pre-training, pseudo-summarization, which integrates content selection knowledge through various centrality-based sentence recovery tasks. However, despite the substantial results, ther...
| Main Authors: | Daniil Chernyshev, Boris Dobrov |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2024-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/10474365/ |
Similar Items
- A literature review of abstractive summarization methods
  by: D. V. Shypik, et al.
  Published: (2019-12-01)
- Abstractive vs. Extractive Summarization: An Experimental Review
  by: Nikolaos Giarelis, et al.
  Published: (2023-06-01)
- Abstractive text summarization using Pre-Trained Language Model "Text-to-Text Transfer Transformer (T5)"
  by: Qurrota A’yuna Itsnaini, et al.
  Published: (2023-04-01)
- Reinforced Abstractive Text Summarization With Semantic Added Reward
  by: Heewon Jang, et al.
  Published: (2021-01-01)
- WHORU: Improving Abstractive Dialogue Summarization with Personal Pronoun Resolution
  by: Tingting Zhou
  Published: (2023-07-01)