Paper title
Inexact reduced gradient methods in nonconvex optimization
Paper authors
Paper abstract
This paper proposes and develops new linesearch methods with inexact gradient information for finding stationary points of nonconvex continuously differentiable functions on finite-dimensional spaces. Some abstract convergence results for a broad class of linesearch methods are established. A general scheme for inexact reduced gradient (IRG) methods is proposed, where the errors in the gradient approximation automatically adapt to the magnitudes of the exact gradients. The sequences of iterates are shown to have stationary accumulation points when different stepsize selections are employed. Convergence results with constructive convergence rates for the developed IRG methods are established under the Kurdyka-Łojasiewicz property. The obtained results for the IRG methods are confirmed by encouraging numerical experiments, which demonstrate the advantages of automatically controlled errors in IRG methods over other frequently used error selections.
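The abstract's central idea, gradient errors that scale with the magnitude of the exact gradient combined with a linesearch stepsize, can be illustrated with a small numerical sketch. The code below is not the authors' IRG scheme; it is a minimal, hypothetical simulation in which the inexact gradient carries a relative error of level nu and an Armijo-type backtracking linesearch chooses the stepsize. The function name irg_sketch and the parameters nu, sigma, beta, t0 are illustrative assumptions, not taken from the paper.

import numpy as np

def irg_sketch(f, grad, x0, nu=0.5, sigma=1e-4, beta=0.5, t0=1.0,
               tol=1e-6, max_iter=1000, rng=None):
    """Toy inexact-gradient descent with backtracking linesearch.

    At each iterate the exact gradient is perturbed by a simulated error
    whose size is proportional to the gradient magnitude (relative error
    level nu), mimicking the adaptive error control described above.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g_exact = grad(x)
        if np.linalg.norm(g_exact) <= tol:
            break
        # Simulated inexact gradient: error norm is nu * ||grad f(x)||.
        noise = rng.standard_normal(x.shape)
        noise *= nu * np.linalg.norm(g_exact) / max(np.linalg.norm(noise), 1e-16)
        g = g_exact + noise
        # Armijo-type backtracking along the inexact direction -g.
        t, fx = t0, f(x)
        for _ in range(50):  # cap the number of backtracking steps
            if f(x - t * g) <= fx - sigma * t * np.dot(g, g):
                break
            t *= beta
        x = x - t * g
    return x

# Example: minimize the (nonconvex) Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_final = irg_sketch(f, grad, np.array([-1.2, 1.0]), max_iter=20000)

In this toy setting, keeping the relative error level nu below 1 ensures that the inexact direction is still a descent direction for f, so the backtracking step is well defined; how the actual IRG schemes select stepsizes and control errors is specified in the paper itself.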