PTR: Prompt Tuning with Rules for Text Classification
Recently, prompt tuning has been widely applied to stimulate the rich knowledge in pre-trained language models (PLMs) to serve NLP tasks. Although prompt tuning has achieved promising results on some few-class classification tasks, such as sentiment classification and natural language inference, manually designing prompts is cumbersome. Meanwhile, generating prompts automatically is also difficult and time-consuming. Therefore, obtaining effective prompts for complex many-class classification tasks remains a challenge. In this paper, we propose to encode the prior knowledge of a classification task into rules, then design sub-prompts according to the rules, and finally combine the sub-prompts to handle the task. We name this Prompt Tuning method with Rules "PTR". Compared with existing prompt-based methods, PTR achieves a good trade-off between effectiveness and efficiency in building prompts. We conduct experiments on three many-class classification tasks, including relation classification, entity typing, and intent classification. The results show that PTR outperforms both vanilla and prompt tuning baselines, indicating the effectiveness of utilizing rules for prompt tuning. The source code of PTR is available at https://github.com/thunlp/PTR.
Main Authors: | Xu Han; Weilin Zhao; Ning Ding; Zhiyuan Liu; Maosong Sun |
---|---|
Format: | Article |
Language: | English |
Published: | KeAi Communications Co. Ltd., 2022-01-01 |
Series: | AI Open |
Subjects: | Pre-trained language models; Prompt tuning |
Online Access: | http://www.sciencedirect.com/science/article/pii/S2666651022000183 |
_version_ | 1797977775216263168 |
author | Xu Han; Weilin Zhao; Ning Ding; Zhiyuan Liu; Maosong Sun |
author_facet | Xu Han; Weilin Zhao; Ning Ding; Zhiyuan Liu; Maosong Sun |
author_sort | Xu Han |
collection | DOAJ |
description | Recently, prompt tuning has been widely applied to stimulate the rich knowledge in pre-trained language models (PLMs) to serve NLP tasks. Although prompt tuning has achieved promising results on some few-class classification tasks, such as sentiment classification and natural language inference, manually designing prompts is cumbersome. Meanwhile, generating prompts automatically is also difficult and time-consuming. Therefore, obtaining effective prompts for complex many-class classification tasks remains a challenge. In this paper, we propose to encode the prior knowledge of a classification task into rules, then design sub-prompts according to the rules, and finally combine the sub-prompts to handle the task. We name this Prompt Tuning method with Rules "PTR". Compared with existing prompt-based methods, PTR achieves a good trade-off between effectiveness and efficiency in building prompts. We conduct experiments on three many-class classification tasks, including relation classification, entity typing, and intent classification. The results show that PTR outperforms both vanilla and prompt tuning baselines, indicating the effectiveness of utilizing rules for prompt tuning. The source code of PTR is available at https://github.com/thunlp/PTR. |
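The abstract describes composing a task prompt from rule-conditioned sub-prompts. The sketch below is an illustrative toy in Python, not the authors' implementation (see https://github.com/thunlp/PTR for that): the function names, the template wording, and the example rule (entity-type conditions plus a relation condition, each contributing a [MASK] slot) are assumptions made for demonstration only.

```python
# Toy sketch of PTR-style prompt composition for relation classification.
# A logic rule such as  person(x) AND person(y) AND "x's parent was y"
# is decomposed into sub-prompts, one per condition; each sub-prompt
# contributes a [MASK] slot whose predicted label words jointly determine
# the class (e.g. per:parents). Names and templates here are hypothetical.

def entity_subprompt(entity: str) -> str:
    """Sub-prompt querying the type of one entity ([MASK] in {person, organization, ...})."""
    return f"the [MASK] {entity}"

def relation_subprompt() -> str:
    """Sub-prompt querying the relation holding between the two entities."""
    return "[MASK]"

def build_prompt(sentence: str, head: str, tail: str) -> str:
    """Combine the sub-prompts into one template appended to the input sentence."""
    template = f"{entity_subprompt(head)} {relation_subprompt()} {entity_subprompt(tail)}"
    return f"{sentence} {template} ."

prompt = build_prompt("Mark Twain was the father of Langdon.", "Mark Twain", "Langdon")
print(prompt)
# A masked language model would then fill the three [MASK] slots; label words
# like ("person", "'s parent was", "person") map the predictions to a class.
```

Because each sub-prompt is reused across every rule that shares its condition, the number of templates to design grows with the number of conditions rather than the number of classes, which is the effectiveness/efficiency trade-off the abstract refers to.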
first_indexed | 2024-04-11T05:12:23Z |
format | Article |
id | doaj.art-df633041b94c4e33855ea37c2b4552ec |
institution | Directory Open Access Journal |
issn | 2666-6510 |
language | English |
last_indexed | 2024-04-11T05:12:23Z |
publishDate | 2022-01-01 |
publisher | KeAi Communications Co. Ltd. |
record_format | Article |
series | AI Open |
spelling | doaj.art-df633041b94c4e33855ea37c2b4552ec (indexed 2022-12-25T04:19:33Z). English. KeAi Communications Co. Ltd. AI Open, ISSN 2666-6510, 2022-01-01, Vol. 3, pp. 182-192. PTR: Prompt Tuning with Rules for Text Classification. Xu Han, Weilin Zhao, Ning Ding, Zhiyuan Liu, Maosong Sun (all: Dept. of Comp. Sci. & Tech., Institute for AI, Tsinghua University, Beijing National Research Center for Information Science and Technology, China; Zhiyuan Liu and Maosong Sun additionally: Institute Guo Qiang, Tsinghua University, China; International Innovation Center of Tsinghua University, China; Beijing Academy of Artificial Intelligence, BAAI, China; corresponding authors). Abstract: see the description field above. http://www.sciencedirect.com/science/article/pii/S2666651022000183. Pre-trained language models; Prompt tuning |
spellingShingle | Xu Han Weilin Zhao Ning Ding Zhiyuan Liu Maosong Sun PTR: Prompt Tuning with Rules for Text Classification AI Open Pre-trained language models Prompt tuning |
title | PTR: Prompt Tuning with Rules for Text Classification |
title_full | PTR: Prompt Tuning with Rules for Text Classification |
title_fullStr | PTR: Prompt Tuning with Rules for Text Classification |
title_full_unstemmed | PTR: Prompt Tuning with Rules for Text Classification |
title_short | PTR: Prompt Tuning with Rules for Text Classification |
title_sort | ptr prompt tuning with rules for text classification |
topic | Pre-trained language models; Prompt tuning |
url | http://www.sciencedirect.com/science/article/pii/S2666651022000183 |
work_keys_str_mv | AT xuhan ptrprompttuningwithrulesfortextclassification AT weilinzhao ptrprompttuningwithrulesfortextclassification AT ningding ptrprompttuningwithrulesfortextclassification AT zhiyuanliu ptrprompttuningwithrulesfortextclassification AT maosongsun ptrprompttuningwithrulesfortextclassification |