Paper Title

Betti numbers of attention graphs is all you really need

Paper Authors

Laida Kushnareva, Dmitri Piontkovski, Irina Piontkovskaya

Paper Abstract

We apply methods of topological analysis to attention graphs computed from the attention heads of the BERT model (arXiv:1810.04805v2). Our research shows that a classifier built upon basic persistent topological features (namely, Betti numbers) of a trained neural network can achieve classification results on par with conventional classification methods. We demonstrate the relevance of such topological text representations on three text classification benchmarks. To the best of our knowledge, this is the first attempt to analyze the topology of an attention-based neural network of the kind widely used for Natural Language Processing.
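The abstract outlines a pipeline: turn each BERT attention matrix into a graph over tokens, compute Betti numbers of that graph per head, and feed them to a downstream classifier. Below is a minimal sketch of that idea, not the authors' code: the model name, the threshold value of 0.1, and the feature layout are illustrative assumptions, and it uses the Hugging Face transformers and networkx libraries.

```python
import networkx as nx
import numpy as np
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

def betti_features(text: str, threshold: float = 0.1):
    """Return a (beta_0, beta_1) pair for every attention head on one text."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    features = []
    # outputs.attentions is a tuple with one tensor per layer,
    # each of shape (batch=1, num_heads, seq_len, seq_len).
    for layer_attention in outputs.attentions:
        for head in layer_attention[0]:  # (seq_len, seq_len) attention matrix
            # Keep token pairs whose attention weight exceeds the threshold
            # (an assumed cutoff) and treat them as edges of an undirected graph.
            adjacency = (head > threshold).numpy()
            np.fill_diagonal(adjacency, False)  # drop self-attention loops
            graph = nx.from_numpy_array(adjacency)
            beta_0 = nx.number_connected_components(graph)
            # For a graph, beta_1 = |E| - |V| + number of components.
            beta_1 = graph.number_of_edges() - graph.number_of_nodes() + beta_0
            features.append((beta_0, beta_1))
    return features

# Example: bert-base has 12 layers x 12 heads, so 144 (beta_0, beta_1) pairs.
print(betti_features("Topology meets attention."))
```

For a graph viewed as a one-dimensional complex, beta_0 counts connected components and beta_1 = |E| - |V| + beta_0 counts independent cycles, so these two features need no persistent-homology library; the persistent features the paper refers to would presumably track how they change as the threshold varies.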
