Paraphrase Generation Model Integrating Transformer Architecture, Part-of-Speech Features, and Pointer Generator Network
In recent years, hardware advancements have enabled natural language processing tasks that were previously difficult to achieve due to their intense computing requirements. This study focuses on paraphrase generation, which entails rewriting a sentence using different words and sentence structures while preserving its original meaning. This increases sentence diversity, thereby improving the performance of downstream tasks, such as question–answering systems and machine translation. This study proposes a novel paraphrase generation model that combines the Transformer architecture with part-of-speech features, and this model is trained using a Chinese corpus. New features are incorporated to improve the performance of the Transformer architecture, and the pointer generation network is used when the training data contain low-frequency words. This allows the model to focus on input words with important information according to their attention distributions.
Main Authors: | Yu-Chia Tsai, Feng-Cheng Lin |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2023-01-01 |
Series: | IEEE Access |
Subjects: | Multi-encoder; paraphrase generation; pointer generation network; transformer |
Online Access: | https://ieeexplore.ieee.org/document/10078879/ |
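The abstract describes two mechanisms: part-of-speech (POS) features fed into the Transformer encoder, and a pointer generator network that lets the decoder copy low-frequency source words according to its attention distribution. The sketch below is a minimal illustration of those two ideas, not the authors' implementation: it uses the standard pointer-generator mixing formula (a generation probability p_gen interpolating between a vocabulary softmax and a copy distribution built from attention weights), and all module names, dimensions, and the toy inputs in the demo are assumptions made for illustration.

```python
# Minimal sketch (not the authors' code): it pairs (1) POS-enriched encoder
# embeddings with (2) a pointer-generator output head that interpolates between
# generating from the vocabulary and copying source tokens via attention.
# Layer names, sizes, and the random demo inputs are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class POSAwareEmbedding(nn.Module):
    """Sums word and part-of-speech embeddings before the Transformer encoder."""

    def __init__(self, vocab_size: int, pos_tag_size: int, d_model: int):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(pos_tag_size, d_model)

    def forward(self, token_ids, pos_ids):
        return self.tok_emb(token_ids) + self.pos_emb(pos_ids)


class PointerGeneratorHead(nn.Module):
    """Mixes a vocabulary softmax with a copy distribution built from the
    decoder's cross-attention over the source sentence."""

    def __init__(self, d_model: int, vocab_size: int):
        super().__init__()
        self.vocab_proj = nn.Linear(d_model, vocab_size)
        self.p_gen_proj = nn.Linear(2 * d_model, 1)  # decoder state + context vector

    def forward(self, dec_state, context, attn_weights, src_token_ids):
        # dec_state, context: (batch, tgt_len, d_model)
        # attn_weights:       (batch, tgt_len, src_len), rows sum to 1
        # src_token_ids:      (batch, src_len)
        p_vocab = F.softmax(self.vocab_proj(dec_state), dim=-1)
        p_gen = torch.sigmoid(self.p_gen_proj(torch.cat([dec_state, context], dim=-1)))
        # Scatter attention mass onto the ids of the source words, so rare or
        # low-frequency source words can be copied directly into the paraphrase.
        copy_dist = torch.zeros_like(p_vocab)
        index = src_token_ids.unsqueeze(1).expand(-1, dec_state.size(1), -1)
        copy_dist = copy_dist.scatter_add(-1, index, attn_weights)
        return p_gen * p_vocab + (1.0 - p_gen) * copy_dist


if __name__ == "__main__":
    B, S, T, D, V, P = 2, 7, 5, 32, 100, 20
    embed = POSAwareEmbedding(V, P, D)
    head = PointerGeneratorHead(D, V)
    src = torch.randint(0, V, (B, S))
    pos = torch.randint(0, P, (B, S))
    enc_input = embed(src, pos)                     # would feed an nn.TransformerEncoder
    dec_state = torch.randn(B, T, D)                # stand-in for decoder outputs
    context = torch.randn(B, T, D)                  # stand-in for attention context
    attn = F.softmax(torch.randn(B, T, S), dim=-1)  # cross-attention weights over source
    out = head(dec_state, context, attn, src)
    print(out.shape, out.sum(-1))                   # (2, 5, 100); each row sums to ~1
```

The record's subject terms mention a multi-encoder design; summing the token and POS embeddings into a single encoder input here is a simplification of that idea.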
author | Yu-Chia Tsai; Feng-Cheng Lin
collection | DOAJ |
description | In recent years, hardware advancements have enabled natural language processing tasks that were previously difficult to achieve due to their intense computing requirements. This study focuses on paraphrase generation, which entails rewriting a sentence using different words and sentence structures while preserving its original meaning. This increases sentence diversity, thereby improving the performance of downstream tasks, such as question–answering systems and machine translation. This study proposes a novel paraphrase generation model that combines the Transformer architecture with part-of-speech features, and this model is trained using a Chinese corpus. New features are incorporated to improve the performance of the Transformer architecture, and the pointer generation network is used when the training data contain low-frequency words. This allows the model to focus on input words with important information according to their attention distributions. |
format | Article |
id | doaj.art-53518647672444b7ae75ae7814273d7d |
institution | Directory Open Access Journal |
issn | 2169-3536 |
language | English |
publishDate | 2023-01-01 |
publisher | IEEE |
series | IEEE Access |
spelling | IEEE Access, vol. 11, pp. 30109–30117, 2023-01-01. DOI: 10.1109/ACCESS.2023.3260849 (IEEE document 10078879). Authors: Yu-Chia Tsai; Feng-Cheng Lin (ORCID: 0000-0003-0457-3757), both with the Department of Information Engineering and Computer Science, Feng Chia University, Taichung, Taiwan.
title | Paraphrase Generation Model Integrating Transformer Architecture, Part-of-Speech Features, and Pointer Generator Network |
topic | Multi-encoder; paraphrase generation; pointer generation network; transformer
url | https://ieeexplore.ieee.org/document/10078879/ |