Paper Title
$m^4Adapter$: Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter

Paper Authors

Wen Lai, Alexandra Chronopoulou, Alexander Fraser

Paper Abstract
Multilingual neural machine translation models (MNMT) yield state-of-the-art performance when evaluated on data from a domain and language pair seen at training time. However, when a MNMT model is used to translate under domain shift or to a new language pair, performance drops dramatically. We consider a very challenging scenario: adapting the MNMT model both to a new domain and to a new language pair at the same time. In this paper, we propose $m^4Adapter$ (Multilingual Multi-Domain Adaptation for Machine Translation with a Meta-Adapter), which combines domain and language knowledge using meta-learning with adapters. We present results showing that our approach is a parameter-efficient solution which effectively adapts a model to both a new language pair and a new domain, while outperforming other adapter methods. An ablation study also shows that our approach more effectively transfers domain knowledge across different languages and language information across different domains.
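The abstract describes combining meta-learning with adapters so that a small set of parameters can adapt to unseen (domain, language pair) combinations. As a rough illustration only, the sketch below shows a first-order Reptile-style meta-update applied to adapter parameters, with each task standing for one (domain, language pair); the paper's actual algorithm, parameterization, and hyperparameters may differ, and all names here (`inner_update`, `reptile_step`, the toy gradient dictionaries) are invented for illustration.

```python
# Hedged sketch: Reptile-style meta-learning over adapter parameters.
# The MNMT backbone is assumed frozen; only the small adapter weights
# (represented here as a plain dict of floats) are meta-trained.
# Each "task" is one (domain, language-pair) combination, represented
# by a list of precomputed gradient dicts, one per inner SGD step.

def inner_update(adapter, task_grads, inner_lr=0.1):
    """Take a few SGD steps on a single task; return the adapted params."""
    adapted = dict(adapter)
    for grads in task_grads:           # one gradient dict per inner step
        for name, g in grads.items():
            adapted[name] -= inner_lr * g
    return adapted

def reptile_step(adapter, tasks, meta_lr=0.5):
    """Move the shared adapter toward each task-adapted solution
    (first-order Reptile meta-update, averaged over sampled tasks)."""
    new = dict(adapter)
    for task_grads in tasks:
        adapted = inner_update(adapter, task_grads)
        for name in new:
            new[name] += meta_lr * (adapted[name] - adapter[name]) / len(tasks)
    return new

# Toy usage: one shared adapter weight, two (domain, language-pair) tasks.
shared = {"w": 1.0}
tasks = [
    [{"w": 0.4}],    # task A: one inner step with gradient 0.4
    [{"w": -0.2}],   # task B: one inner step with gradient -0.2
]
shared = reptile_step(shared, tasks)
print(shared["w"])   # moved slightly toward the average adapted solution
```

At test time, the meta-trained adapter would then be fine-tuned for a few steps on the new domain and language pair, starting from `shared`, which is the point of the meta-initialization.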