Paper Title


Improving Target-side Lexical Transfer in Multilingual Neural Machine Translation

Authors

Luyu Gao, Xinyi Wang, Graham Neubig

Abstract


To improve the performance of Neural Machine Translation~(NMT) for low-resource languages~(LRL), one effective strategy is to leverage parallel data from a related high-resource language~(HRL). However, multilingual data has been found to be more beneficial for NMT models that translate from the LRL to a target language than for those that translate into the LRL. In this paper, we aim to improve the effectiveness of multilingual transfer for NMT models that translate \emph{into} the LRL, by designing a better decoder word embedding. Extending a general-purpose multilingual encoding method, Soft Decoupled Encoding~\citep{SDE}, we propose DecSDE, an efficient character n-gram based embedding specifically designed for the NMT decoder. Our experiments show that DecSDE leads to consistent gains of up to 1.8 BLEU on translation from English to four different languages.
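To make the character n-gram embedding idea concrete, below is a minimal, hedged sketch of the general mechanism such methods build on: a word's embedding is composed from hashed character n-gram vectors rather than looked up in a purely word-level table, which lets related languages share subword parameters. All names, sizes, and the hashing scheme here are illustrative assumptions, not DecSDE's actual parameterization.

```python
import zlib
import numpy as np

def char_ngrams(word, n_min=1, n_max=4):
    """Collect all character n-grams of a word, with boundary markers."""
    w = f"<{word}>"
    return [w[i:i + n] for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

def ngram_embedding(word, emb_table, num_buckets):
    """Sum hashed n-gram vectors into one word embedding.

    A sketch of the character n-gram composition idea only; the paper's
    decoder embedding (DecSDE) uses its own, more elaborate design.
    """
    # Deterministic hashing (crc32) maps each n-gram to a table row.
    ids = [zlib.crc32(g.encode()) % num_buckets for g in char_ngrams(word)]
    return emb_table[ids].sum(axis=0)

rng = np.random.default_rng(0)
table = rng.normal(size=(10_000, 64))  # hypothetical n-gram embedding table
vec = ngram_embedding("translate", table, num_buckets=10_000)
```

Because spellings overlap across related languages, cognates in an HRL and LRL share many n-gram rows and hence transfer embedding parameters, which is the intuition behind applying such embeddings on the decoder side.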
