Paper Title
Iteration Complexity of Fixed-Step Methods by Nesterov and Polyak for Convex Quadratic Functions

Paper Authors

Melinda Hagedorn, Florian Jarre

Paper Abstract
This note considers the momentum method by Polyak and the accelerated gradient method by Nesterov, both without line search but with fixed step length, applied to strictly convex quadratic functions, assuming that exact gradients are used and that appropriate upper and lower bounds for the extreme eigenvalues of the Hessian matrix are known. Simple 2-D examples show that the Euclidean distance of the iterates to the optimal solution can be non-monotone. In this context, an explicit bound is derived on the number of iterations needed to guarantee a reduction of the Euclidean distance to the optimal solution by a factor $\varepsilon$. For both methods the bound is optimal up to a constant factor; it complements earlier asymptotically optimal results for the momentum method, and it establishes another link between the momentum method and Nesterov's accelerated gradient method.