Paper Title
Graph Representation Learning with Individualization and Refinement
Paper Authors
Paper Abstract
Graph Neural Networks (GNNs) have emerged as prominent models for representation learning on graph-structured data. GNNs follow an approach of message passing analogous to the 1-dimensional Weisfeiler-Lehman (1-WL) test for graph isomorphism and consequently are limited by the distinguishing power of 1-WL. More expressive higher-order GNNs, which operate on k-tuples of nodes, need increased computational resources in order to process higher-order tensors. Instead of the WL approach, in this work, we follow the classical approach of Individualization and Refinement (IR), a technique employed by most practical isomorphism solvers. Individualization refers to artificially distinguishing a node in the graph, and refinement is the propagation of this information to other nodes through message passing. We learn to adaptively select nodes to individualize and to aggregate the resulting graphs after refinement to help handle the complexity. Our technique lets us learn richer node embeddings while keeping the computational complexity manageable. Theoretically, we show that our procedure is more expressive than the 1-WL test. Experiments show that our method outperforms prominent 1-WL GNN models as well as competitive higher-order baselines on several benchmark synthetic and real datasets. Furthermore, our method opens new doors for exploring the paradigm of learning on graph structures with individualization and refinement.
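The individualization-and-refinement primitive the abstract describes can be sketched in a few lines. The following is a minimal illustrative sketch, not the paper's learned model (which adaptively selects nodes and aggregates the resulting graphs): it implements plain 1-WL color refinement on an adjacency-list graph, and `individualize_and_refine` marks one node with an artificial unique color before refining. All function and variable names here are assumptions for illustration.

```python
def refine(adj, colors, rounds=10):
    """1-WL color refinement: repeatedly re-color each node by its own
    color together with the multiset of its neighbors' colors."""
    for _ in range(rounds):
        # Signature = (own color, sorted multiset of neighbor colors).
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        # Compress distinct signatures to small integer color ids.
        ids = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        new_colors = {v: ids[sigs[v]] for v in adj}
        if new_colors == colors:  # partition is stable
            break
        colors = new_colors
    return colors

def individualize_and_refine(adj, v):
    """Individualization: give node v an artificial distinguishing color,
    then propagate that distinction through refinement."""
    colors = {u: 0 for u in adj}
    colors[v] = 1  # the artificial mark
    return refine(adj, colors)

# Example: a 6-cycle. Plain 1-WL cannot distinguish any of its nodes,
# but individualizing node 0 partitions the cycle by distance from 0.
adj = {0: [1, 5], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4, 0]}
plain = refine(adj, {v: 0 for v in adj})
ir = individualize_and_refine(adj, 0)
```

On the 6-cycle, `plain` assigns every node the same color (one color class), while `ir` yields four classes: node 0 alone, {1, 5}, {2, 4}, and node 3 alone, showing how a single individualization step strictly increases the distinguishing power of refinement.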