Sequence-Level Training for Non-Autoregressive Neural Machine Translation

Abstract: In recent years, Neural Machine Translation (NMT) has achieved notable results in various translation tasks. However, the word-by-word generation manner determined by the autoregressive mechanism leads to high translation latency of the NMT and restricts its low-latency appli...

Bibliographic Details
Main Authors: Chenze Shao, Yang Feng, Jinchao Zhang, Fandong Meng, Jie Zhou
Format: Article
Language: English
Published: The MIT Press 2021-12-01
Series: Computational Linguistics
Online Access: https://direct.mit.edu/coli/article/47/4/891/107176/Sequence-Level-Training-for-Non-Autoregressive