Paper Title


DOTIN: Dropping Task-Irrelevant Nodes for GNNs

Authors

Shaofeng Zhang, Feng Zhu, Junchi Yan, Rui Zhao, Xiaokang Yang

Abstract


Scalability is an important consideration for deep graph neural networks. Inspired by the conventional pooling layers in CNNs, many recent graph learning approaches have introduced pooling strategies to reduce the size of graphs for learning, such that scalability and efficiency can be improved. However, these pooling-based methods are mainly tailored to a single graph-level task and pay more attention to local information, limiting their performance in multi-task settings, which often require task-specific global information. In this paper, departing from these pooling-based efforts, we design a new approach called DOTIN (\underline{D}r\underline{o}pping \underline{T}ask-\underline{I}rrelevant \underline{N}odes) to reduce the size of graphs. Specifically, by introducing $K$ learnable virtual nodes to represent the graph embeddings targeted to $K$ different graph-level tasks, respectively, up to 90\% of raw nodes with low attentiveness under an attention model -- a transformer in this paper -- can be adaptively dropped without notable performance degradation. Achieving almost the same accuracy, our method speeds up GAT by about 50\% on graph-level tasks, including graph classification and graph edit distance (GED), with about 60\% less memory on the D\&D dataset. Code will be made publicly available at https://github.com/Sherrylone/DOTIN.
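To make the node-dropping idea concrete, below is a minimal PyTorch sketch of the mechanism the abstract describes: $K$ learnable virtual nodes attend over the raw node embeddings, and raw nodes receiving the lowest total attention are dropped. This is not the authors' implementation (see the repository linked above); the class name, `drop_ratio` parameter, and single-head dot-product attention are illustrative assumptions.

```python
# Sketch of task-virtual-node attention dropping (assumed names, not DOTIN's code).
import torch
import torch.nn as nn


class TaskNodeDropper(nn.Module):
    def __init__(self, dim: int, num_tasks: int, drop_ratio: float = 0.9):
        super().__init__()
        # One learnable virtual node per graph-level task (the K in the paper).
        self.virtual_nodes = nn.Parameter(torch.randn(num_tasks, dim))
        self.drop_ratio = drop_ratio

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim) node embeddings of a single graph.
        # Dot-product attention from each virtual node to each raw node.
        scores = self.virtual_nodes @ x.t()           # (K, num_nodes)
        attn = scores.softmax(dim=-1).sum(dim=0)      # total attentiveness per node
        # Keep only the raw nodes with the highest attentiveness.
        num_keep = max(1, int(x.size(0) * (1.0 - self.drop_ratio)))
        keep_idx = attn.topk(num_keep).indices
        # Virtual nodes carry the task-specific graph embeddings downstream.
        return torch.cat([self.virtual_nodes, x[keep_idx]], dim=0)


# Usage: with drop_ratio=0.9, a 100-node graph is reduced to 10 raw nodes
# plus the 2 task virtual nodes.
dropper = TaskNodeDropper(dim=64, num_tasks=2, drop_ratio=0.9)
out = dropper(torch.randn(100, 64))
print(out.shape)  # torch.Size([12, 64])
```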
