Paper Title

AdaGrid: Adaptive Grid Search for Link Prediction Training Objective

Paper Authors

Poštuvan, Tim, You, Jiaxuan, Banaei, Mohammadreza, Lebret, Rémi, Leskovec, Jure

Paper Abstract


One of the most important factors that contribute to the success of a machine learning model is a good training objective. The training objective crucially influences the model's performance and generalization capabilities. This paper specifically focuses on the graph neural network training objective for link prediction, which has not been explored in the existing literature. Here, the training objective includes, among others, a negative sampling strategy and various hyperparameters, such as the edge message ratio, which controls how training edges are used. Commonly, these hyperparameters are fine-tuned by a complete grid search, which is very time-consuming and model-dependent. To mitigate these limitations, we propose Adaptive Grid Search (AdaGrid), which dynamically adjusts the edge message ratio during training. It is model-agnostic and highly scalable, with a fully customizable computational budget. Through extensive experiments, we show that AdaGrid can boost the performance of the models by up to $1.9\%$ while being nine times more time-efficient than a complete search. Overall, AdaGrid represents an effective automated algorithm for designing machine learning training objectives.
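The abstract does not spell out the algorithm's mechanics, but the core idea of adjusting the edge message ratio during training rather than fixing it via a full grid search can be illustrated with a minimal, hypothetical sketch. The names `adagrid_train`, `train_step`, `validate`, and `candidate_ratios` below are assumptions for illustration, not the paper's actual API; a real implementation would evaluate candidates on a validation split of a GNN link-prediction model.

```python
def adagrid_train(train_step, validate, candidate_ratios, epochs, adjust_every):
    """Sketch of AdaGrid-style training (hypothetical interface).

    Every `adjust_every` epochs, probe each candidate edge message ratio
    with the `validate` callback and keep the best-scoring one, instead
    of committing to a single ratio chosen by an upfront grid search.
    """
    ratio = candidate_ratios[0]
    history = []  # record which ratio was used at each epoch
    for epoch in range(epochs):
        if epoch % adjust_every == 0:
            # Re-select the ratio that currently scores best on validation.
            ratio = max(candidate_ratios, key=validate)
        train_step(ratio)  # one training epoch using the chosen ratio
        history.append(ratio)
    return history
```

Because only a handful of candidates are probed at each checkpoint, the computational budget stays customizable (fewer candidates or less frequent adjustment means cheaper search), which matches the scalability claim in the abstract.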
