Language as a latent sequence: Deep latent variable models for semi-supervised paraphrase generation
This paper explores deep latent variable models for semi-supervised paraphrase generation, where the missing target pair for unlabelled data is modelled as a latent paraphrase sequence. We present a novel unsupervised model named variational sequence auto-encoding reconstruction (VSAR), which perfor...
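The abstract describes modelling the missing paraphrase of unlabelled text as a latent sequence inferred by a variational sequence auto-encoder. Below is a minimal, illustrative PyTorch sketch of that general idea (a Gaussian latent-variable sequence auto-encoder trained by minimising a negative ELBO). It is not the authors' VSAR implementation; the class name `LatentSeqAutoEncoder`, the architecture choices (GRU encoder/decoder), and all dimensions are assumptions for illustration only.

```python
# Minimal sketch (not the paper's code): a Gaussian latent-variable
# sequence auto-encoder, illustrating the idea of reconstructing a
# sentence through a latent code z, as in variational sequence models.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentSeqAutoEncoder(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hid_dim=128, z_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.to_mu = nn.Linear(hid_dim, z_dim)
        self.to_logvar = nn.Linear(hid_dim, z_dim)
        self.z_to_h = nn.Linear(z_dim, hid_dim)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src, tgt_in, tgt_out):
        # Encode the observed sentence into a Gaussian posterior q(z | x).
        _, h = self.encoder(self.embed(src))            # h: (1, B, hid_dim)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        # Reparameterisation trick: z = mu + sigma * eps.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        # Decode conditioned on z with teacher forcing.
        h0 = torch.tanh(self.z_to_h(z)).unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(tgt_in), h0)
        logits = self.out(dec_out)
        # Negative ELBO = reconstruction loss + KL(q(z|x) || N(0, I)).
        recon = F.cross_entropy(
            logits.reshape(-1, logits.size(-1)), tgt_out.reshape(-1))
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + kl

# Toy usage: unsupervised reconstruction (target = source) on random ids.
model = LatentSeqAutoEncoder()
src = torch.randint(0, 1000, (4, 12))
loss = model(src, src[:, :-1], src[:, 1:])
loss.backward()
```

In a semi-supervised setting of the kind the abstract outlines, such a model could be trained on unlabelled sentences by reconstruction alone, with labelled paraphrase pairs contributing an additional supervised decoding objective.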
| Main Authors: | Jialin Yu, Alexandra I. Cristea, Anoushka Harit, Zhongtian Sun, Olanrewaju Tahir Aduragba, Lei Shi, Noura Al Moubayed |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | KeAi Communications Co. Ltd., 2023-01-01 |
| Series: | AI Open |
| Online Access: | http://www.sciencedirect.com/science/article/pii/S2666651023000025 |
Similar Items

- Ask me in your own words: paraphrasing for multitask question answering
  by: G. Thomas Hudson, et al.
  Published: (2021-10-01)
- Robust Semi-Supervised Point Cloud Registration via Latent GMM-Based Correspondence
  by: Zhengyan Zhang, et al.
  Published: (2023-09-01)
- Meta-Semi: A Meta-Learning Approach for Semi-Supervised Learning
  by: Yulin Wang, et al.
  Published: (2022-12-01)
- A Semi-Supervised Paraphrase Identification Model Based on Multi-Granularity Interaction Reasoning
  by: Xu Li, et al.
  Published: (2020-01-01)
- Corpus-Based Paraphrase Detection Experiments and Review
  by: Tedo Vrbanec, et al.
  Published: (2020-04-01)