Title
An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization
Authors
Abstract
This paper focuses on the minimization of the sum of a twice continuously differentiable function $f$ and a nonsmooth convex function. An inexact regularized proximal Newton method is proposed, using an approximation to the Hessian of $f$ that involves the $\varrho$th power of the KKT residual. For $\varrho=0$, we justify the global convergence of the iterate sequence for a KL objective function and its R-linear convergence rate for a KL objective function of exponent $1/2$. For $\varrho\in(0,1)$, by assuming that cluster points satisfy a locally Hölderian error bound of order $q$ on a second-order stationary point set and a local error bound of order $q>1\!+\!\varrho$ on the common stationary point set, respectively, we establish the global convergence of the iterate sequence and its superlinear convergence rate with order depending on $q$ and $\varrho$. A dual semismooth Newton augmented Lagrangian method is also developed for seeking inexact minimizers of the subproblems. Numerical comparisons with two state-of-the-art methods on $\ell_1$-regularized Student's $t$-regressions, group penalized Student's $t$-regressions, and nonconvex image restoration confirm the efficiency of the proposed method.