Paper Title

Exploiting Syntactic Structure for Better Language Modeling: A Syntactic Distance Approach

Paper Authors

Wenyu Du, Zhouhan Lin, Yikang Shen, Timothy J. O'Donnell, Yoshua Bengio, Yue Zhang

Paper Abstract

It is commonly believed that knowledge of syntactic structure should improve language modeling. However, effectively and computationally efficiently incorporating syntactic structure into neural language models has been a challenging topic. In this paper, we make use of a multi-task objective, i.e., the models simultaneously predict words as well as ground truth parse trees in a form called "syntactic distances", where information between these two separate objectives shares the same intermediate representation. Experimental results on the Penn Treebank and Chinese Treebank datasets show that when ground truth parse trees are provided as additional training signals, the model is able to achieve lower perplexity and induce trees with better quality.
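To make the "syntactic distance" form concrete: in one common construction (following the syntactic-distance literature the paper builds on), the distance between each pair of adjacent words is the height of their lowest common ancestor in the parse tree, so the sequence of distances fully encodes the tree's bracketing. The sketch below is a minimal, hypothetical illustration of that construction on a nested-tuple parse tree; it is not the paper's code, and the exact formulation in the paper may differ.

```python
def syntactic_distances(tree):
    """Return (distances, height) for a parse tree given as nested tuples.

    Leaves are word strings with height 0. For each pair of adjacent
    words, the distance is the height of their lowest common ancestor,
    so a sentence of n words yields n-1 distances. This is one standard
    way to encode a constituency tree as a distance sequence; it is an
    illustrative assumption, not the paper's exact implementation.
    """
    if isinstance(tree, str):              # leaf: a single word
        return [], 0
    child_dists, child_heights = [], []
    for child in tree:
        d, h = syntactic_distances(child)
        child_dists.append(d)
        child_heights.append(h)
    height = max(child_heights) + 1        # node height = 1 + tallest child
    out = child_dists[0][:]
    for d in child_dists[1:]:
        out.append(height)                 # adjacent words split at this node
        out.extend(d)
    return out, height


# "((a b) c)": the split between b and c is higher than between a and b
dists, height = syntactic_distances((("a", "b"), "c"))
print(dists, height)  # [1, 2] 2
```

Because the distance sequence and the tree are interconvertible, a model that predicts these distances alongside the next word (sharing one intermediate representation, as the abstract describes) is effectively supervised by the ground-truth parse while remaining a language model.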
