Paper Title
Gated Graph Recurrent Neural Networks
Paper Authors
Paper Abstract
Graph processes exhibit a temporal structure determined by the sequence index and a spatial structure determined by the graph support. To learn from graph processes, an information processing architecture must then be able to exploit both underlying structures. We introduce Graph Recurrent Neural Networks (GRNNs) as a general learning framework that achieves this goal by leveraging the notion of a recurrent hidden state together with graph signal processing (GSP). In the GRNN, the number of learnable parameters is independent of the length of the sequence and of the size of the graph, guaranteeing scalability. We prove that GRNNs are permutation equivariant and that they are stable to perturbations of the underlying graph support. To address the problem of vanishing gradients, we also put forward gated GRNNs with three different gating mechanisms: time, node, and edge gates. In numerical experiments involving both synthetic and real datasets, time-gated GRNNs are shown to improve upon GRNNs in problems with long-term dependencies, while node and edge gates help encode long-range dependencies present in the graph. The numerical results also show that GRNNs outperform GNNs and RNNs, highlighting the importance of taking both the temporal and graph structures of a graph process into account.
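To make the recurrence described in the abstract concrete, below is a minimal NumPy sketch of a time-gated GRNN cell: the hidden state is updated by graph-filtered versions of the current input and the previous state, so the learnable filter taps do not grow with the graph size or the sequence length. This is an illustrative sketch under stated assumptions, not the authors' implementation; the function names (graph_filter, time_gated_grnn), the filter order K, the feature sizes, and the per-node sigmoid gate are all hypothetical choices.

import numpy as np

def graph_filter(S, X, taps):
    # Polynomial graph filter: sum_k S^k X taps[k].
    # taps has shape (K, F_in, F_out); its size is independent of the
    # number of nodes N and of the sequence length T.
    Y = np.zeros((X.shape[0], taps.shape[2]))
    Z = X
    for k in range(taps.shape[0]):
        Y = Y + Z @ taps[k]
        Z = S @ Z
    return Y

def time_gated_grnn(S, X_seq, A, B, C, gate_a, gate_b):
    # Hypothetical time-gated GRNN recurrence (a sketch, not the paper's exact cell):
    #   q_t = sigmoid(a(S) X_t + b(S) H_{t-1})      (one gate value per node)
    #   H_t = tanh(A(S) X_t + q_t * B(S) H_{t-1})
    #   Y_t = C(S) H_t
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    N, F = S.shape[0], A.shape[2]
    H = np.zeros((N, F))
    outputs = []
    for X_t in X_seq:                                # one graph signal per time step
        q = sigmoid(graph_filter(S, X_t, gate_a) + graph_filter(S, H, gate_b))
        H = np.tanh(graph_filter(S, X_t, A) + q * graph_filter(S, H, B))
        outputs.append(graph_filter(S, H, C))
    return outputs

# Example usage on a random graph and a random graph-signal sequence.
rng = np.random.default_rng(0)
N, T, K, G, F = 20, 15, 3, 4, 8                      # nodes, time steps, taps, input/state features
S = rng.random((N, N)) < 0.2
S = ((S | S.T) & ~np.eye(N, dtype=bool)).astype(float)   # symmetric adjacency, no self-loops
S = S / np.abs(np.linalg.eigvalsh(S)).max()              # normalize the graph shift operator
X_seq = rng.standard_normal((T, N, G))
A = rng.standard_normal((K, G, F))
B = rng.standard_normal((K, F, F))
C = rng.standard_normal((K, F, 1))
gate_a, gate_b = rng.standard_normal((K, G, 1)), rng.standard_normal((K, F, 1))
Y_seq = time_gated_grnn(S, X_seq, A, B, C, gate_a, gate_b)
print(len(Y_seq), Y_seq[0].shape)                    # T outputs, each of shape (N, 1)

Note how the parameter count depends only on the filter order K and the feature dimensions, which illustrates the scalability claim in the abstract; node and edge gates would follow the same pattern but modulate individual nodes or edges of the shift operator instead of the whole state contribution.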