Domain Adaptation with Pre-trained Transformers for Query-Focused Abstractive Text Summarization
The Query-Focused Text Summarization (QFTS) task aims at building systems that generate a summary of the text document(s) based on a given query. A key challenge in addressing this task is the lack of large labeled data for training the summarization model. In this article, we address this challenge...
Main Authors: Md Tahmid Rahman Laskar, Enamul Hoque, Jimmy Xiangji Huang
Format: Article
Language: English
Published: The MIT Press, 2022-03-01
Series: Computational Linguistics
Online Access: http://dx.doi.org/10.1162/coli_a_00434
Similar Items
- Abstractive text summarization using Pre-Trained Language Model "Text-to-Text Transfer Transformer (T5)"
  by: Qurrota A’yuna Itsnaini, et al.
  Published: (2023-04-01)
- Investigation of Pre-Trained Bidirectional Encoder Representations from Transformers Checkpoints for Indonesian Abstractive Text Summarization
  by: Lucky, Henry, et al.
  Published: (2022)
- Multi-Encoder Transformer for Korean Abstractive Text Summarization
  by: Youhyun Shin
  Published: (2023-01-01)
- Turkish abstractive text document summarization using text to text transfer transformer
  by: Betul Ay, et al.
  Published: (2023-04-01)
- Investigating the Pre-Training Bias in Low-Resource Abstractive Summarization
  by: Daniil Chernyshev, et al.
  Published: (2024-01-01)