Mask Transformer: Unpaired Text Style Transfer Based on Masked Language

Bibliographic Details
Main Authors: Chunhua Wu, Xiaolong Chen, Xingbiao Li
Format: Article
Language: English
Published: MDPI AG 2020-09-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/10/18/6196
_version_ 1797554458134052864
author Chunhua Wu
Xiaolong Chen
Xingbiao Li
author_facet Chunhua Wu
Xiaolong Chen
Xingbiao Li
author_sort Chunhua Wu
collection DOAJ
description Currently, most text style transfer methods encode the text into a style-independent latent representation and decode it into new sentences with the target style. Due to the limitations of the latent representation, previous works can hardly produce satisfactory target-style sentences, especially in terms of preserving the semantics of the original sentence. We propose a “Mask and Generation” structure, which obtains an explicit representation of the content of the original sentence and generates the target sentence with a transformer. This explicit representation is a masked text in which the words with strong style attributes are masked out; it can therefore preserve most of the semantic meaning of the original sentence. In addition, since this representation is the input of the generator, it also simplifies the generation process compared to current works that generate the target sentence from scratch. Because the explicit representation is readable and the model has better interpretability, we can clearly see which words changed and why. We evaluate our model on two review datasets with quantitative, qualitative, and human evaluations. The experimental results show that our model generally outperforms other methods in terms of transfer accuracy and content preservation.
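The abstract describes a two-step “Mask and Generation” pipeline: first mask the words with strong style attributes, then let a transformer fill in target-style words. A minimal, hypothetical sketch of the masking step is below; the smoothed frequency-ratio scoring, the `threshold` value, and the `<mask>` token are illustrative assumptions, not the paper's exact method.

```python
from collections import Counter

def style_scores(pos_corpus, neg_corpus, smoothing=1.0):
    """Score each word's association with one style corpus vs. the other.

    Returns a smoothed frequency ratio per word: values far from 1.0
    (in either direction) indicate a strong style attribute.
    """
    pos = Counter(w for s in pos_corpus for w in s.split())
    neg = Counter(w for s in neg_corpus for w in s.split())
    vocab = set(pos) | set(neg)
    return {w: (pos[w] + smoothing) / (neg[w] + smoothing) for w in vocab}

def mask_style_words(sentence, scores, threshold=3.0):
    """Replace words whose style salience exceeds the threshold with <mask>."""
    def salience(w):
        r = scores.get(w, 1.0)          # unseen words are style-neutral
        return max(r, 1.0 / r)          # strong in either style direction
    return " ".join("<mask>" if salience(w) >= threshold else w
                    for w in sentence.split())

pos = ["the food was great", "great service and great staff"]
neg = ["the food was terrible", "terrible service and rude staff"]
scores = style_scores(pos, neg)
print(mask_style_words("the food was great", scores))  # → the food was <mask>
```

The masked output (e.g. “the food was <mask>”) is the explicit, readable content representation the abstract refers to; a transformer generator conditioned on the target style would then fill the masked slots.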
first_indexed 2024-03-10T16:32:21Z
format Article
id doaj.art-eab5aa0749e641aaa41848db81264da5
institution Directory Open Access Journal
issn 2076-3417
language English
last_indexed 2024-03-10T16:32:21Z
publishDate 2020-09-01
publisher MDPI AG
record_format Article
series Applied Sciences
spelling doaj.art-eab5aa0749e641aaa41848db81264da5 | 2023-11-20T12:46:44Z | eng | MDPI AG | Applied Sciences | 2076-3417 | 2020-09-01 | Vol. 10, Iss. 18, Art. 6196 | 10.3390/app10186196 | Mask Transformer: Unpaired Text Style Transfer Based on Masked Language | Chunhua Wu, Xiaolong Chen, Xingbiao Li (School of Cyberspace Security, Beijing University of Posts and Telecommunications, Beijing 100876, China) | Currently, most text style transfer methods encode the text into a style-independent latent representation and decode it into new sentences with the target style. Due to the limitations of the latent representation, previous works can hardly produce satisfactory target-style sentences, especially in terms of preserving the semantics of the original sentence. We propose a “Mask and Generation” structure, which obtains an explicit representation of the content of the original sentence and generates the target sentence with a transformer. This explicit representation is a masked text in which the words with strong style attributes are masked out; it can therefore preserve most of the semantic meaning of the original sentence. In addition, since this representation is the input of the generator, it also simplifies the generation process compared to current works that generate the target sentence from scratch. Because the explicit representation is readable and the model has better interpretability, we can clearly see which words changed and why. We evaluate our model on two review datasets with quantitative, qualitative, and human evaluations. The experimental results show that our model generally outperforms other methods in terms of transfer accuracy and content preservation. | https://www.mdpi.com/2076-3417/10/18/6196 | natural language processing; masked language; transformer; style transfer
spellingShingle Chunhua Wu
Xiaolong Chen
Xingbiao Li
Mask Transformer: Unpaired Text Style Transfer Based on Masked Language
Applied Sciences
natural language processing
masked language
transformer
style transfer
title Mask Transformer: Unpaired Text Style Transfer Based on Masked Language
title_full Mask Transformer: Unpaired Text Style Transfer Based on Masked Language
title_fullStr Mask Transformer: Unpaired Text Style Transfer Based on Masked Language
title_full_unstemmed Mask Transformer: Unpaired Text Style Transfer Based on Masked Language
title_short Mask Transformer: Unpaired Text Style Transfer Based on Masked Language
title_sort mask transformer unpaired text style transfer based on masked language
topic natural language processing
masked language
transformer
style transfer
url https://www.mdpi.com/2076-3417/10/18/6196
work_keys_str_mv AT chunhuawu masktransformerunpairedtextstyletransferbasedonmaskedlanguage
AT xiaolongchen masktransformerunpairedtextstyletransferbasedonmaskedlanguage
AT xingbiaoli masktransformerunpairedtextstyletransferbasedonmaskedlanguage