Neural machine translation model combining dependency syntax and LSTM


Bibliographic Details
Main Authors: Zheng Xin, Chen Hailong, Ma Yuqun, Wang Qing
Format: Article
Language: English
Published: EDP Sciences 2022-01-01
Series: ITM Web of Conferences
Online Access: https://www.itm-conferences.org/articles/itmconf/pdf/2022/07/itmconf_cccar2022_02038.pdf
Description
Summary: To address the Transformer neural machine translation model's lack of linguistic knowledge and the insufficient flexibility of its positional encoding, this paper introduces dependency syntax analysis and a long short-term memory (LSTM) network. Syntactic structure information from the source language is built into the neural machine translation system, and more accurate position information is obtained by exploiting the memory characteristics of the LSTM. Experiments show that the improved model gains 1.23 BLEU points on the IWSLT14 Chinese-English translation task.
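The abstract only sketches the architecture. As a rough illustration of the core idea of deriving position information from an LSTM's memory rather than from fixed positional encodings, a minimal PyTorch sketch follows. The class name LSTMPositionalEncoder, the residual sum, and all dimensions are illustrative assumptions, not the paper's implementation.

# Hypothetical sketch: an LSTM run over token embeddings supplies
# position-aware representations in place of fixed sinusoidal positional
# encoding. Names and structure here are assumptions for illustration only.
import torch
import torch.nn as nn

class LSTMPositionalEncoder(nn.Module):
    """Encode token order with an LSTM instead of fixed position vectors."""

    def __init__(self, d_model: int):
        super().__init__()
        # batch_first=True: inputs are (batch, seq_len, d_model).
        self.lstm = nn.LSTM(d_model, d_model, batch_first=True)

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # The LSTM's hidden state at step t depends on steps 1..t,
        # so its outputs carry positional information implicitly.
        position_aware, _ = self.lstm(embeddings)
        # Residual sum keeps the original token content, analogous to
        # adding sinusoidal encodings to the embeddings.
        return embeddings + position_aware

if __name__ == "__main__":
    batch, seq_len, d_model = 2, 10, 512
    tokens = torch.randn(batch, seq_len, d_model)  # stand-in embeddings
    out = LSTMPositionalEncoder(d_model)(tokens)
    print(out.shape)  # torch.Size([2, 10, 512])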
ISSN: 2271-2097