PTR: Prompt Tuning with Rules for Text Classification
Recently, prompt tuning has been widely applied to stimulate the rich knowledge in pre-trained language models (PLMs) to serve NLP tasks. Although prompt tuning has achieved promising results on some few-class classification tasks, such as sentiment classification and natural language inference, man...
Main Authors: Xu Han, Weilin Zhao, Ning Ding, Zhiyuan Liu, Maosong Sun
Format: Article
Language: English
Published: KeAi Communications Co. Ltd., 2022-01-01
Series: AI Open
Online Access: http://www.sciencedirect.com/science/article/pii/S2666651022000183
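
The abstract frames prompt tuning as recasting classification into a cloze task that a masked PLM already knows how to solve. As a rough illustration of that general idea (not PTR itself, which composes rule-based sub-prompts for many-class tasks), the sketch below scores label words in a [MASK] slot using Hugging Face's fill-mask pipeline; the template and the "great"/"terrible" verbalizer are hypothetical choices for a sentiment example.

```python
# Minimal sketch of cloze-style prompt-based sentiment classification,
# in the spirit of the prompt tuning the abstract describes. NOT the
# PTR method; the template and verbalizer are illustrative assumptions.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

text = "The movie was a waste of two hours."
# Wrap the input in a template with a [MASK] slot for the label word.
template = f"{text} Overall, it was [MASK]."

# Verbalizer: map label words the PLM can predict to class names.
verbalizer = {"great": "positive", "terrible": "negative"}

# Score only the candidate label words in the [MASK] position.
predictions = fill_mask(template, targets=list(verbalizer))
scores = {verbalizer[p["token_str"]]: p["score"] for p in predictions}
print(max(scores, key=scores.get))  # predicted class
```

In this formulation the classifier is just the PLM's masked-token distribution restricted to the label words, so the same model can be adapted to a new task by changing only the template and verbalizer.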
Similar Items
- CPT: Colorful Prompt Tuning for pre-trained vision-language models
  by: Yuan Yao, et al.
  Published: (2024-01-01)
- Cue prompt adapting model for relation extraction
  by: Kai Wang, et al.
  Published: (2023-12-01)
- Multi-Stage Prompt Tuning for Political Perspective Detection in Low-Resource Settings
  by: Kang-Min Kim, et al.
  Published: (2023-05-01)
- REKP: Refined External Knowledge into Prompt-Tuning for Few-Shot Text Classification
  by: Yuzhuo Dang, et al.
  Published: (2023-11-01)
- ConKgPrompt: Contrastive Sample Method Based on Knowledge-Guided Prompt Learning for Text Classification
  by: Qian Wang, et al.
  Published: (2023-08-01)