Paper Title
Be More with Less: Hypergraph Attention Networks for Inductive Text Classification
Paper Authors
Abstract
Text classification is a critical research topic with broad applications in natural language processing. Recently, graph neural networks (GNNs) have received increasing attention in the research community and demonstrated their promising results on this canonical task. Despite the success, their performance could be largely jeopardized in practice since they are: (1) unable to capture high-order interaction between words; (2) inefficient to handle large datasets and new documents. To address those issues, in this paper, we propose a principled model -- hypergraph attention networks (HyperGAT), which can obtain more expressive power with less computational consumption for text representation learning. Extensive experiments on various benchmark datasets demonstrate the efficacy of the proposed approach on the text classification task.
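The abstract's key idea is that hyperedges let a model aggregate over whole groups of words (e.g. all words in a sentence) rather than only pairwise edges, with attention applied at both the node-to-hyperedge and hyperedge-to-node steps. The following is a minimal, dependency-free sketch of that dual attention pattern under simplifying assumptions (no learned projections, no nonlinearity, hand-supplied attention vectors); the function names are illustrative and this is not the authors' implementation of HyperGAT.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def hypergraph_attention_layer(X, hyperedges, a_edge, a_node):
    """One simplified hypergraph attention pass (illustrative sketch).

    X          : list of word-node feature vectors
    hyperedges : list of node-index lists; each hyperedge groups several
                 words (e.g. one sentence), capturing high-order interactions
    a_edge     : attention vector scoring nodes within a hyperedge
    a_node     : attention vector scoring hyperedges around a node
    """
    d = len(X[0])
    # Step 1: node-level attention builds each hyperedge representation
    # as an attention-weighted sum of its member word nodes.
    edge_reprs = []
    for e in hyperedges:
        scores = softmax([dot(X[v], a_edge) for v in e])
        r = [0.0] * d
        for s, v in zip(scores, e):
            r = [ri + s * xi for ri, xi in zip(r, X[v])]
        edge_reprs.append(r)
    # Step 2: edge-level attention updates each word node from the
    # hyperedges it belongs to.
    out = []
    for v in range(len(X)):
        incident = [j for j, e in enumerate(hyperedges) if v in e]
        if not incident:
            out.append(list(X[v]))  # isolated node keeps its features
            continue
        scores = softmax([dot(edge_reprs[j], a_node) for j in incident])
        h = [0.0] * d
        for s, j in zip(scores, incident):
            h = [hi + s * ri for hi, ri in zip(h, edge_reprs[j])]
        out.append(h)
    return out
```

Because each hyperedge aggregates an arbitrary-size word group in one step, the document graph stays small and per-document, which is what makes the approach inductive: a new document builds its own hypergraph without retraining on a corpus-level graph.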