Paper Title

GPT-too: A language-model-first approach for AMR-to-text generation

Paper Authors

Manuel Mager, Ramon Fernandez Astudillo, Tahira Naseem, Md Arafat Sultan, Young-Suk Lee, Radu Florian, Salim Roukos

Paper Abstract

Abstract Meaning Representations (AMRs) are broad-coverage sentence-level semantic graphs. Existing approaches to generating text from AMR have focused on training sequence-to-sequence or graph-to-sequence models on AMR annotated data only. In this paper, we propose an alternative approach that combines a strong pre-trained language model with cycle consistency-based re-scoring. Despite the simplicity of the approach, our experimental results show that these models outperform all previous techniques on the English LDC2017T10 dataset, including the recent use of transformer architectures. In addition to the standard evaluation metrics, we provide human evaluation experiments that further substantiate the strength of our approach.
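The core idea described in the abstract is to generate candidate sentences with a fine-tuned language model and then re-rank them by cycle consistency: each candidate is parsed back to an AMR graph, and the candidate whose reconstructed graph best matches the input is selected. A minimal sketch of this re-scoring step is given below; it is an illustration under assumptions, not the authors' implementation. The helpers `generate_candidates`, `parse_to_amr`, and `amr_similarity` are hypothetical stand-ins for an n-best decoder over a fine-tuned language model (e.g., GPT-2), an off-the-shelf AMR parser, and a graph-overlap metric such as a Smatch-style score.

```python
# Minimal sketch of cycle consistency-based re-scoring (not the authors' code).
# Assumed, hypothetical helpers, passed in as parameters:
#   generate_candidates(amr) -> list[str]: n-best texts from a fine-tuned LM
#   parse_to_amr(text)       -> AMR graph reconstructed by any AMR parser
#   amr_similarity(a, b)     -> float, e.g., a Smatch-style overlap score

def cycle_consistency_rescore(input_amr, generate_candidates,
                              parse_to_amr, amr_similarity):
    """Return the candidate whose re-parsed AMR best matches the input AMR."""
    best_text, best_score = None, float("-inf")
    for text in generate_candidates(input_amr):
        reconstructed = parse_to_amr(text)                # cycle back: text -> AMR
        score = amr_similarity(input_amr, reconstructed)  # consistency with input
        if score > best_score:
            best_text, best_score = text, score
    return best_text
```

Passing the generator, parser, and metric as parameters keeps the sketch self-contained; in practice the reconstructed graph would typically be compared to the input with Smatch, the standard matching metric for AMR.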
