Paper Title

Gradient Gating for Deep Multi-Rate Learning on Graphs

Authors

Rusch, T. Konstantin, Chamberlain, Benjamin P., Mahoney, Michael W., Bronstein, Michael M., Mishra, Siddhartha

Abstract

We present Gradient Gating (G$^2$), a novel framework for improving the performance of Graph Neural Networks (GNNs). Our framework is based on gating the output of GNN layers with a mechanism for multi-rate flow of message passing information across nodes of the underlying graph. Local gradients are harnessed to further modulate message passing updates. Our framework flexibly allows one to use any basic GNN layer as a wrapper around which the multi-rate gradient gating mechanism is built. We rigorously prove that G$^2$ alleviates the oversmoothing problem and allows the design of deep GNNs. Empirical results are presented to demonstrate that the proposed framework achieves state-of-the-art performance on a variety of graph learning tasks, including on large-scale heterophilic graphs.
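The abstract describes gating each GNN layer's output with per-node, per-channel rates derived from local graph gradients, so that features evolve at multiple rates across the graph. Below is a minimal NumPy sketch of one such update step, based only on this high-level description; the specific layer `gnn_layer`, the weight matrices `W`/`W_tau`, and the exact gating formula are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def gnn_layer(X, A, W):
    # Simple base message-passing layer (illustrative): mean-aggregate
    # neighbor features, then apply a learned transform and nonlinearity.
    deg = A.sum(axis=1, keepdims=True)
    return np.tanh((A @ X) / np.maximum(deg, 1) @ W)

def g2_update(X, A, W, W_tau, p=2.0):
    """One gradient-gating step (sketch).

    X: node features (n, d); A: adjacency matrix (n, n).
    A candidate update Y comes from any base GNN layer; an auxiliary
    signal tau is passed through the same kind of layer, and its local
    graph gradient sets per-node, per-channel update rates in [0, 1).
    """
    Y = gnn_layer(X, A, W)          # candidate update from the wrapped layer
    tau = gnn_layer(X, A, W_tau)    # auxiliary features for the gate
    # Local graph gradient: for each node i and channel k, sum
    # |tau_jk - tau_ik|^p over neighbors j of i.
    diff = np.abs(tau[None, :, :] - tau[:, None, :]) ** p   # (n, n, d)
    grad = (A[:, :, None] * diff).sum(axis=1)               # (n, d)
    T = np.tanh(grad)  # gate rates in [0, 1); small where tau is smooth
    # Multi-rate update: each node/channel moves toward Y at its own rate.
    # Where the local gradient vanishes (T -> 0), X is left unchanged,
    # which is the mechanism that counteracts oversmoothing.
    return (1.0 - T) * X + T * Y
```

Stacking many such steps keeps the update rate tied to local feature variation, so nodes whose neighborhood features have already homogenized stop updating rather than smoothing further.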
