Paper Title

Improved SVRG for quadratic functions

Authors

Kahale, Nabil

Abstract

We analyse an iterative algorithm to minimize quadratic functions whose Hessian matrix $H$ is the expectation of a random symmetric $d\times d$ matrix. The algorithm is a variant of the stochastic variance reduced gradient (SVRG). In several applications, including least-squares regression, ridge regression, linear discriminant analysis and regularized linear discriminant analysis, the running time of each iteration is proportional to $d$. Under smoothness and convexity conditions, the algorithm has linear convergence. When applied to quadratic functions, our analysis improves the state-of-the-art performance of SVRG by up to a logarithmic factor. Furthermore, for well-conditioned quadratic problems, our analysis improves the state-of-the-art running times of accelerated SVRG, and is better than the known matching lower bound, by a logarithmic factor. Our theoretical results are backed by numerical experiments.
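As a concrete reference point for the setting described in the abstract, the sketch below applies the standard SVRG template to a quadratic objective $f(x) = \frac{1}{2} x^\top H x - b^\top x$, where $H$ is the average of symmetric matrices $A_i$. This is a generic illustration of variance-reduced gradient descent, not the paper's improved variant; the function name `svrg_quadratic` and the parameters `eta`, `n_epochs`, and `m_inner` are illustrative choices, not from the paper.

```python
import numpy as np

def svrg_quadratic(As, b, eta=0.05, n_epochs=20, m_inner=None, seed=0):
    """Generic SVRG sketch for f(x) = 0.5 x'Hx - b'x with H = mean(As).

    `As` is a list of symmetric d x d matrices whose average plays the
    role of H; the stochastic gradient at x for sample i is As[i] @ x - b.
    Illustrative only -- not the paper's specific variant.
    """
    rng = np.random.default_rng(seed)
    n, d = len(As), b.shape[0]
    m = m_inner or 2 * n          # inner-loop length per snapshot
    x_snap = np.zeros(d)
    for _ in range(n_epochs):
        # Full gradient at the snapshot point: H @ x_snap - b.
        full_grad = np.mean([A @ x_snap for A in As], axis=0) - b
        x = x_snap.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient:
            # (A_i x - b) - (A_i x_snap - b) + full gradient at snapshot.
            g = As[i] @ (x - x_snap) + full_grad
            x -= eta * g
        x_snap = x
    return x_snap

# Example usage: a least-squares-style problem where each component is
# the rank-one matrix a_i a_i^T (hypothetical test data).
rng = np.random.default_rng(1)
a = rng.standard_normal((50, 5))
As = [np.outer(ai, ai) for ai in a]
b = rng.standard_normal(5)
x_hat = svrg_quadratic(As, b)
```

In the rank-one case shown in the usage example, each inner-loop update needs only inner products with $a_i$, which is what makes an $O(d)$ per-iteration cost plausible in applications such as least-squares and ridge regression.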
