Paper Title

Hero-Gang Neural Model For Named Entity Recognition

Authors

Hu, Jinpeng, Shen, Yaling, Liu, Yang, Wan, Xiang, Chang, Tsung-Hui

Abstract

Named entity recognition (NER) is a fundamental and important task in NLP, aiming at identifying named entities (NEs) from free text. Recently, since the multi-head attention mechanism applied in the Transformer model can effectively capture longer contextual information, Transformer-based models have become the mainstream methods and have achieved significant performance in this task. Unfortunately, although these models can capture effective global context information, they are still limited in the local feature and position information extraction, which is critical in NER. In this paper, to address this limitation, we propose a novel Hero-Gang Neural structure (HGN), including the Hero and Gang module, to leverage both global and local information to promote NER. Specifically, the Hero module is composed of a Transformer-based encoder to maintain the advantage of the self-attention mechanism, and the Gang module utilizes a multi-window recurrent module to extract local features and position information under the guidance of the Hero module. Afterward, the proposed multi-window attention effectively combines global information and multiple local features for predicting entity labels. Experimental results on several benchmark datasets demonstrate the effectiveness of our proposed model.
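The combination step described in the abstract can be illustrated with a toy sketch. This is not the paper's implementation: the real HGN operates on learned vector representations, uses a recurrent (LSTM-style) Gang module guided by the Hero encoder, and learns its attention projections. Here, purely to make the idea concrete, each token's "Hero" state is a single scalar, the Gang module is replaced by a fixed windowed average, and the multi-window attention is a softmax over scalar scores; all function names are hypothetical.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def local_features(hero_states, window):
    """For each token, average the Hero states within +/- `window`
    positions -- a crude stand-in for the recurrent Gang module that
    extracts local features under the Hero module's guidance."""
    n = len(hero_states)
    feats = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        span = hero_states[lo:hi]
        feats.append(sum(span) / len(span))
    return feats

def hero_gang_combine(hero_states, windows=(1, 2, 3)):
    """Fuse the global (Hero) feature with multiple local (Gang)
    features per token, weighting them by attention scores -- the
    analogue of the paper's multi-window attention."""
    gang = [local_features(hero_states, w) for w in windows]
    combined = []
    for i, h in enumerate(hero_states):
        candidates = [h] + [g[i] for g in gang]
        scores = [h * c for c in candidates]   # scalar "dot-product" scores
        weights = softmax(scores)
        combined.append(sum(w * c for w, c in zip(weights, candidates)))
    return combined
```

Because the attention weights form a convex combination and every candidate is an average of Hero states, each fused value stays within the range of the original states; in the actual model the fused vectors would instead feed a label classifier over entity tags.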
