Paper Title

Inductive Representation Learning on Temporal Graphs

Authors

Da Xu, Chuanwei Ruan, Evren Korpeoglu, Sushant Kumar, Kannan Achan

Abstract

Inductive representation learning on temporal graphs is an important step toward scalable machine learning on real-world dynamic networks. The evolving nature of temporal dynamic graphs requires handling new nodes as well as capturing temporal patterns. The node embeddings, which are now functions of time, should represent both the static node features and the evolving topological structures. Moreover, node and topological features can be temporal as well, and the node embeddings should capture their patterns too. We propose the temporal graph attention (TGAT) layer to efficiently aggregate temporal-topological neighborhood features and to learn time-feature interactions. For TGAT, we use the self-attention mechanism as the building block and develop a novel functional time encoding technique based on the classical Bochner's theorem from harmonic analysis. By stacking TGAT layers, the network treats node embeddings as functions of time and is able to inductively infer embeddings for both new and observed nodes as the graph evolves. The proposed approach handles both node classification and link prediction tasks, and can be naturally extended to include temporal edge features. We evaluate our method with transductive and inductive tasks under temporal settings on two benchmark datasets and one industrial dataset. Our TGAT model compares favorably to state-of-the-art baselines as well as previous temporal graph embedding approaches.
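To make the abstract's central idea concrete, below is a minimal PyTorch sketch of a Bochner-style functional time encoding of the kind TGAT describes: a scalar time span t is mapped to Φ(t) = √(1/d)·[cos(ω₁t), sin(ω₁t), …, cos(ω_{d/2}t), sin(ω_{d/2}t)], with the frequencies ωᵢ learned end to end. The class name `FunctionalTimeEncoding` and the geometric frequency initialization are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn


class FunctionalTimeEncoding(nn.Module):
    """Map a scalar time span t to Phi(t) =
    sqrt(1/d) * [cos(w_1 t), sin(w_1 t), ..., cos(w_{d/2} t), sin(w_{d/2} t)],
    with the frequencies w_i learned end to end (a sketch of the Bochner-based
    encoding the abstract refers to)."""

    def __init__(self, dim: int):
        super().__init__()
        if dim % 2 != 0:
            raise ValueError("dim must be even (cos/sin pairs)")
        # Learnable frequencies; the geometric (log-spaced) initialization
        # here is an assumption for illustration.
        self.freqs = nn.Parameter(1.0 / 10.0 ** torch.linspace(0.0, 9.0, dim // 2))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (...,) time spans, e.g. current time minus interaction time.
        angles = t.unsqueeze(-1) * self.freqs           # (..., dim/2)
        enc = torch.cat([angles.cos(), angles.sin()], dim=-1)
        return enc / enc.shape[-1] ** 0.5               # sqrt(1/d) scaling


# In a TGAT-style layer, this encoding would be concatenated with node/edge
# features before self-attention, letting attention scores depend on the
# time gaps between interactions.
enc = FunctionalTimeEncoding(dim=64)
dt = torch.tensor([0.0, 1.5, 300.0])  # time since each neighbor's interaction
print(enc(dt).shape)                   # torch.Size([3, 64])
```

Because the encoding is a continuous function of t rather than a lookup over discrete time steps, it applies to unseen timestamps and unseen nodes alike, which is what enables the inductive inference the abstract claims.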
