Zero-shot text classification via self-supervised tuning

Existing solutions to zero-shot text classification either conduct prompting with pre-trained language models, which is sensitive to the choices of templates, or rely on large-scale annotated data of relevant tasks for meta-tuning. In this work, we propose a new paradigm based on self-supervised...
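As context for the first approach the abstract mentions, the sketch below illustrates template-based zero-shot prompting with an off-the-shelf masked language model; it is not the paper's self-supervised tuning method. The model name, template string, and verbalizer words are illustrative assumptions, and changing the template can change the prediction, which is the template sensitivity the abstract points out.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Off-the-shelf pre-trained masked LM (illustrative choice).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def zero_shot_classify(text, verbalizers,
                       template="{text} This text is about {mask}."):
    # Fill the hand-written prompt template and locate the mask position.
    prompt = template.format(text=text, mask=tokenizer.mask_token)
    inputs = tokenizer(prompt, return_tensors="pt")
    mask_pos = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero().item()
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    # Score each class by the logit of its (single-token) verbalizer word.
    scores = {label: logits[tokenizer.convert_tokens_to_ids(word)].item()
              for label, word in verbalizers.items()}
    return max(scores, key=scores.get)

# Hypothetical label set and verbalizer words, for illustration only.
print(zero_shot_classify("The striker scored twice in the final minutes.",
                         {"sports": "sports", "politics": "politics"}))
```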

Bibliographic details

Main authors: Liu, Chaoqun; Zhang, Wenxuan; Chen, Guizhen; Wu, Xiaobao; Luu, Anh Tuan; Chang, Chip Hong; Bing, Lidong
Other authors: Interdisciplinary Graduate School (IGS)
Format: Conference Paper
Language: English
Published: 2023
Subjects:
Online access: https://hdl.handle.net/10356/168505
https://2023.aclweb.org/