Paper Title

Randomized subspace regularized Newton method for unconstrained non-convex optimization

Paper Authors

Fuji, Terunari, Poirion, Pierre-Louis, Takeda, Akiko

Paper Abstract

While randomized subspace Newton methods that restrict the search direction to a random subspace already exist for convex functions, we propose a randomized subspace regularized Newton method for non-convex functions, and more generally we thoroughly investigate the local convergence rate of the randomized subspace Newton method. Our proposed algorithm uses a modified Hessian of the function restricted to some random subspace, so that, with high probability, the function value decreases even when the objective function is non-convex. In this paper, we show that our method has global convergence under appropriate assumptions and that its convergence rate is the same as that of the full regularized Newton method. Furthermore, we obtain a local linear convergence rate under some additional assumptions, and prove that this rate is the best we can hope for, in general, when using a random subspace. We further prove that if the Hessian at the local optimum is rank deficient, then superlinear convergence holds.
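To make the idea concrete, below is a minimal sketch of one plausible randomized subspace regularized Newton loop. It is illustrative only, not the paper's exact scheme: the Gaussian sketch `P`, the eigenvalue-shift regularization rule, the subspace dimension, and the Armijo line search are all placeholder choices made for this example.

```python
import numpy as np

def rsrn_minimize(f, grad, hess, x0, subspace_dim=2, iters=100,
                  reg=1e-3, rng=None):
    """Illustrative randomized subspace regularized Newton loop
    (hypothetical sketch; parameters are not from the paper)."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    d = x.size
    for _ in range(iters):
        # Gaussian sketch: columns of P span a random subspace of R^d.
        P = rng.standard_normal((d, subspace_dim)) / np.sqrt(subspace_dim)
        g = grad(x)
        gs = P.T @ g                     # gradient restricted to the subspace
        Hs = P.T @ hess(x) @ P           # Hessian restricted to the subspace
        # Shift eigenvalues so the subspace model is positive definite
        # even when f is non-convex (a simple regularization rule).
        lam_min = np.linalg.eigvalsh(Hs)[0]
        Hreg = Hs + (max(0.0, -lam_min) + reg) * np.eye(subspace_dim)
        ds = np.linalg.solve(Hreg, -gs)  # regularized Newton step in subspace
        p = P @ ds                       # lift the step back to R^d
        # Backtracking (Armijo) line search guarantees descent:
        # since Hreg is positive definite, g @ p = -gs^T Hreg^{-1} gs <= 0.
        t, armijo = 1.0, 1e-4 * (g @ p)
        for _ in range(30):
            if f(x + t * p) <= f(x) + t * armijo:
                x = x + t * p
                break
            t *= 0.5
    return x

# Usage on a simple non-convex test function f(x) = sum(x^4 - x^2),
# whose Hessian is indefinite near the origin.
f = lambda x: np.sum(x**4 - x**2)
grad = lambda x: 4 * x**3 - 2 * x
hess = lambda x: np.diag(12 * x**2 - 2)
x0 = np.full(5, 0.3)
x_star = rsrn_minimize(f, grad, hess, x0)
```

Because each step solves only a `subspace_dim x subspace_dim` regularized Newton system, the per-iteration cost scales with the sketch size rather than the full dimension, which is the practical appeal of the subspace restriction described in the abstract.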
