Neural Machine Translation with CARU-Embedding Layer and CARU-Gated Attention Layer

The attention mechanism performs well for the Neural Machine Translation (NMT) task, but it depends heavily on the context vectors generated by the attention network to predict target words. This reliance raises the issue of long-term dependencies. Indeed, it is very common to combine predicates with p...


Bibliographic Details
Main Authors: Sio-Kei Im, Ka-Hou Chan
Format: Article
Language: English
Published: MDPI AG 2024-03-01
Series: Mathematics
Online Access: https://www.mdpi.com/2227-7390/12/7/997