Revisiting Multi-Domain Machine Translation
Abstract: When building machine translation systems, one often needs to make the best out of heterogeneous sets of parallel data in training, and to robustly handle inputs from unexpected domains in testing. This multi-domain scenario has attracted a lot of recent work that falls under...
| Main Authors: | MinhQuang Pham, Josep Maria Crego, François Yvon |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | The MIT Press, 2021-01-01 |
| Series: | Transactions of the Association for Computational Linguistics |
| Online Access: | https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00351/97775/Revisiting-Multi-Domain-Machine-Translation |
Similar Items
- Revisiting Negation in Neural Machine Translation
  by: Gongbo Tang, et al.
  Published: (2021-01-01)
- Exploring Composite Indexes for Domain Adaptation in Neural Machine Translation
  by: Nhan Vo Minh, et al.
  Published: (2024-02-01)
- Revisiting Back-Translation for Low-Resource Machine Translation Between Chinese and Vietnamese
  by: Hongzheng Li, et al.
  Published: (2020-01-01)
- Multi-Provider and Multi-Domain Resource Orchestration in Network Functions Virtualization
  by: Tuan-Minh Pham, et al.
  Published: (2019-01-01)
- Co-Translational Folding of Multi-Domain Proteins
  by: Nandakumar Rajasekaran, et al.
  Published: (2022-04-01)