Paper Title

RAAT: Relation-Augmented Attention Transformer for Relation Modeling in Document-Level Event Extraction

Paper Authors

Yuan Liang, Zhuoxuan Jiang, Di Yin, Bo Ren

Paper Abstract

In the document-level event extraction (DEE) task, event arguments always scatter across sentences (the across-sentence issue) and multiple events may lie in one document (the multi-event issue). In this paper, we argue that the relation information of event arguments is of great significance for addressing the above two issues, and propose a new DEE framework that can model the relation dependencies, called Relation-augmented Document-level Event Extraction (ReDEE). More specifically, this framework features a novel and tailored transformer, named the Relation-augmented Attention Transformer (RAAT). RAAT is scalable to capture multi-scale and multi-amount argument relations. To further leverage relation information, we introduce a separate event relation prediction task and adopt a multi-task learning method to explicitly enhance event extraction performance. Extensive experiments demonstrate the effectiveness of the proposed method, which achieves state-of-the-art performance on two public datasets. Our code is available at https://github.com/TencentYoutuResearch/RAAT.
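The abstract names the central mechanism, relation-augmented attention, but gives no formulation. The sketch below illustrates one plausible reading of the idea, assuming PyTorch: pairwise relation types between tokens are mapped to learned biases that are added to the self-attention scores before the softmax. The class name, single-head layout, and scalar per-relation bias are simplifying assumptions for illustration only; the paper's actual RAAT layer uses richer, multi-scale relation representations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationAugmentedAttention(nn.Module):
    """Single-head self-attention with an additive bias derived from
    pairwise relation types between tokens (hypothetical sketch, not
    the paper's exact RAAT formulation)."""

    def __init__(self, d_model: int, num_relations: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One learned scalar bias per relation type. Assumption: RAAT
        # uses richer relation embeddings; a scalar keeps the sketch small.
        self.rel_bias = nn.Embedding(num_relations, 1)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor, rel: torch.LongTensor) -> torch.Tensor:
        # x:   (batch, seq_len, d_model) token representations
        # rel: (batch, seq_len, seq_len) relation-type id for each token pair
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = torch.matmul(q, k.transpose(-1, -2)) * self.scale
        # Inject relation information as an additive attention bias.
        scores = scores + self.rel_bias(rel).squeeze(-1)
        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, v)

# Usage: 2 documents, 5 tokens each, 4 relation types (ids 0..3).
x = torch.randn(2, 5, 64)
rel = torch.randint(0, 4, (2, 5, 5))
out = RelationAugmentedAttention(64, num_relations=4)(x, rel)
print(out.shape)  # torch.Size([2, 5, 64])
```

The design choice this sketch highlights is that structure is injected before the softmax, so argument tokens that stand in an annotated relation can attend to each other strongly even when they sit in different sentences, which is exactly the across-sentence setting the abstract describes.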
