Paper Title

Sentence Compression as Deletion with Contextual Embeddings

Authors

Minh-Tien Nguyen, Bui Cong Minh, Dung Tien Le, Le Thai Linh

Abstract

Sentence compression is the task of creating a shorter version of an input sentence while keeping its important information. In this paper, we extend the task of compression by deletion with the use of contextual embeddings. Unlike prior work, which typically uses non-contextual embeddings (GloVe or Word2Vec), we exploit contextual embeddings that enable our model to capture the context of its inputs. More precisely, we stack a bidirectional Long Short-Term Memory (BiLSTM) network and a Conditional Random Field (CRF) on top of contextual embeddings to handle sequence labeling. Experimental results on the benchmark Google dataset show that, by utilizing contextual embeddings, our model achieves a new state-of-the-art F-score compared to the strong methods reported on the leaderboard.
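To make the described architecture concrete, below is a minimal sketch of a deletion-based compression tagger: contextual embeddings feed a BiLSTM, whose outputs score per-token KEEP/DELETE tags for a CRF. This is not the authors' code; the encoder name (`bert-base-uncased`), hidden size, two-tag scheme, and subword-level tagging (the paper would need word-level alignment) are illustrative assumptions, using PyTorch, HuggingFace `transformers`, and the `pytorch-crf` package.

```python
# Sketch of the abstract's architecture, assuming:
#   pip install torch transformers pytorch-crf
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer
from torchcrf import CRF


class CompressionTagger(nn.Module):
    """Contextual embeddings -> BiLSTM -> CRF, one KEEP/DELETE tag per token."""

    def __init__(self, encoder_name="bert-base-uncased", lstm_hidden=256, num_tags=2):
        super().__init__()
        # Pretrained contextual encoder (an assumed choice, not the paper's exact one).
        self.encoder = AutoModel.from_pretrained(encoder_name)
        self.lstm = nn.LSTM(self.encoder.config.hidden_size, lstm_hidden,
                            batch_first=True, bidirectional=True)
        # Emission scores for the tags: 0 = KEEP, 1 = DELETE (illustrative scheme).
        self.emissions = nn.Linear(2 * lstm_hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        # Contextual embedding of each (sub)token.
        hidden = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        lstm_out, _ = self.lstm(hidden)
        scores = self.emissions(lstm_out)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence under the CRF.
            return -self.crf(scores, tags, mask=mask, reduction="mean")
        # Inference: Viterbi decoding of the best tag sequence.
        return self.crf.decode(scores, mask=mask)


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = CompressionTagger()
batch = tokenizer(["the quick brown fox jumps over the lazy dog"],
                  return_tensors="pt")
with torch.no_grad():
    # Untrained weights, so the tags are arbitrary; this just exercises the pipeline.
    print(model(batch["input_ids"], batch["attention_mask"]))
```

The compressed sentence is then read off by keeping exactly the tokens tagged KEEP; the CRF layer lets the model score whole tag sequences rather than making each deletion decision independently.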
