Title
An optimization derivation of the method of conjugate gradients
Authors
Abstract
We give a derivation of the method of conjugate gradients based on the requirement that each iterate minimizes a strictly convex quadratic on the space spanned by the previously observed gradients. Rather than verifying that the search direction has the correct properties, we show that generating such iterates is equivalent to generating orthogonal gradients, which gives a description of both the search direction and the step length. Our approach gives a straightforward way to see that the search direction of the method of conjugate gradients is a negative scalar times the gradient of minimum Euclidean norm evaluated on the affine span of the iterates generated so far.
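For context, the iteration the abstract refers to can be sketched as the standard (textbook) conjugate gradient method for minimizing the strictly convex quadratic f(x) = ½ xᵀAx − bᵀx with A symmetric positive definite. This is a minimal illustrative sketch, not the paper's derivation; the function name and the Fletcher–Reeves form of the β coefficient are choices made here for illustration.

```python
import numpy as np

def conjugate_gradients(A, b, x0=None, tol=1e-10, max_iter=None):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A.

    Equivalently, solve A x = b. The gradients g_k generated below are
    mutually orthogonal, and each iterate minimizes f over the affine space
    x0 + span of the previously observed gradients.
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)
    g = A @ x - b                # gradient of f at the current iterate
    d = -g                       # first search direction: steepest descent
    max_iter = n if max_iter is None else max_iter
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = (g @ g) / (d @ Ad)        # exact line search along d
        x = x + alpha * d
        g_new = g + alpha * Ad            # gradient at the new iterate
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d             # new conjugate search direction
        g = g_new
    return x
```

In exact arithmetic the loop terminates in at most n iterations, since the orthogonal gradients span the whole space once n of them have been generated.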