Paper Title
Adaptive Latent Factor Analysis via Generalized Momentum-Incorporated Particle Swarm Optimization
Paper Authors
Abstract
The stochastic gradient descent (SGD) algorithm is an effective learning strategy for building a latent factor analysis (LFA) model on a high-dimensional and incomplete (HDI) matrix. A particle swarm optimization (PSO) algorithm is commonly adopted to make an SGD-based LFA model's hyper-parameters, i.e., the learning rate and regularization coefficient, self-adaptive. However, a standard PSO algorithm may suffer accuracy loss caused by premature convergence. To address this issue, this paper incorporates more historical information into each particle's evolutionary process to avoid premature convergence, following the principle of a generalized-momentum (GM) method, thereby innovatively achieving a novel GM-incorporated PSO (GM-PSO). With it, a GM-PSO-based LFA (GMPL) model is further built to implement efficient self-adaptation of hyper-parameters. Experimental results on three HDI matrices demonstrate that the GMPL model achieves higher prediction accuracy for missing-data estimation in industrial applications.
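The abstract describes extending the standard PSO velocity update with extra historical (generalized-momentum) information. The paper's exact update rule is not given here, so the sketch below is a hypothetical illustration: a standard PSO step plus one additional term weighted by `gamma` that blends in a decayed accumulation of past velocities. The function name `gm_pso_step`, the coefficient values, and the history-decay scheme are all assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def gm_pso_step(x, v, v_hist, pbest, gbest,
                w=0.7, c1=1.5, c2=1.5, gamma=0.1):
    """One hypothetical GM-PSO update for a single particle.

    On top of the standard PSO velocity rule, a generalized-momentum
    term mixes in a decayed sum of past velocities (v_hist), so the
    particle retains more historical information and is less prone to
    premature convergence. The paper's actual formulation may differ.
    """
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v_new = (w * v
             + c1 * r1 * (pbest - x)    # cognitive term (toward personal best)
             + c2 * r2 * (gbest - x)    # social term (toward global best)
             + gamma * v_hist)          # generalized-momentum term (assumed form)
    x_new = x + v_new
    v_hist_new = 0.5 * v_hist + 0.5 * v  # decayed accumulation of past velocities
    return x_new, v_new, v_hist_new

# Usage: each particle encodes the two hyper-parameters being adapted,
# e.g. (learning rate, regularization coefficient).
x = np.array([0.01, 0.05])
v = np.zeros(2)
v_hist = np.zeros(2)
x, v, v_hist = gm_pso_step(x, v, v_hist,
                           pbest=np.array([0.02, 0.04]),
                           gbest=np.array([0.015, 0.03]))
```

In an LFA setting, each particle's fitness would be the validation error of the SGD-trained model under that particle's hyper-parameter setting; `pbest` and `gbest` are updated from those fitness values as in standard PSO.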