Paper Title

Chebyshev Inertial Iteration for Accelerating Fixed-Point Iterations

Authors

Tadashi Wadayama and Satoshi Takabe

Abstract

A novel method, called the Chebyshev inertial iteration, for accelerating the convergence of fixed-point iterations is presented. The Chebyshev inertial iteration can be regarded as a variant of successive over-relaxation, or of the Krasnosel'skiĭ-Mann iteration, that uses the inverses of the roots of a Chebyshev polynomial as iteration-dependent inertial factors. One of the most notable features of the proposed method is that it can be applied to nonlinear fixed-point iterations in addition to linear ones. Linearization around the fixed point is the key to the analysis of the local convergence rate of the proposed method. The proposed method appears particularly effective for accelerating proximal gradient methods such as ISTA. It is also proved that the proposed method can successfully accelerate almost any fixed-point iteration if all the eigenvalues of the Jacobian at the fixed point are real.
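The abstract describes an inertial update of the form x_{k+1} = x_k + ω_k (f(x_k) - x_k), where the factors ω_k are the inverses of Chebyshev polynomial roots. The following is a minimal sketch of that idea, not the authors' reference implementation; the function name, the argument names, and the assumption that bounds lam_min and lam_max on the eigenvalues of I - J(x*) (J being the Jacobian of the fixed-point map f at the fixed point) are available are all introduced here for illustration.

```python
import numpy as np

def chebyshev_inertial_iteration(f, x0, lam_min, lam_max, T, n_iters):
    """Sketch of a Chebyshev inertial iteration for x = f(x).

    f        : fixed-point map (the plain iteration is x_{k+1} = f(x_k))
    x0       : initial point
    lam_min, lam_max : assumed bounds on the eigenvalues of I - J(x*),
               where J is the Jacobian of f at the fixed point x*
    T        : period of the inertial-factor schedule
               (degree of the underlying Chebyshev polynomial)
    n_iters  : total number of iterations
    """
    # Roots of the degree-T Chebyshev polynomial mapped from [-1, 1]
    # onto [lam_min, lam_max]; the inertial factors are their inverses.
    k = np.arange(T)
    roots = (lam_max + lam_min) / 2.0 \
          + (lam_max - lam_min) / 2.0 * np.cos(np.pi * (2 * k + 1) / (2 * T))
    omegas = 1.0 / roots

    x = np.asarray(x0, dtype=float)
    for i in range(n_iters):
        omega = omegas[i % T]        # cycle through the T inertial factors
        x = x + omega * (f(x) - x)   # inertial (over-relaxation-style) update
    return x
```

As a usage sketch under the same assumptions: for a linear fixed-point map f(x) = Bx + c, the eigenvalues of I - J are those of I - B, so lam_min and lam_max can be taken as any bounds enclosing them; for a nonlinear map such as an ISTA step, the abstract's linearization argument suggests bounding the spectrum of I - J at the fixed point instead.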
