Paper Title

A systematic approach to general higher-order majorization-minimization algorithms for (non)convex optimization

Authors

Ion Necoara, Daniela Lupu

Abstract

Majorization-minimization algorithms consist of successively minimizing a sequence of upper bounds of the objective function, so that the objective function decreases along the iterations. Such a simple principle makes it possible to solve a large class of optimization problems, even nonconvex and nonsmooth ones. We propose a general higher-order majorization-minimization algorithmic framework for minimizing an objective function that admits an approximation (surrogate) such that the corresponding error function has a higher-order Lipschitz continuous derivative. We present convergence guarantees for our new method for general optimization problems with (non)convex and/or (non)smooth objective functions. For convex (possibly nonsmooth) problems we provide global sublinear convergence rates, while for problems with uniformly convex objective functions we obtain locally faster superlinear convergence rates. We also prove global stationary-point guarantees for general nonconvex (possibly nonsmooth) problems, and under the Kurdyka-Łojasiewicz property of the objective function we derive local convergence rates ranging from sublinear to superlinear for our majorization-minimization algorithm. Moreover, for unconstrained nonconvex problems we derive convergence rates in terms of first- and second-order optimality conditions.
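
To make the majorization-minimization principle concrete, here is a minimal Python sketch of one standard higher-order instance: a second-order Taylor surrogate with cubic regularization, which upper-bounds a smooth one-dimensional objective whenever the constant M dominates the Lipschitz constant of the second derivative on the region of interest. This illustrates the general principle only, not the paper's specific algorithm; the objective f, the constant M, and the interval bounds are assumptions chosen for the example.

from scipy.optimize import minimize_scalar

def f(x):       # nonconvex, smooth test objective
    return x**4 - 3 * x**2 + x

def grad(x):    # f'(x)
    return 4 * x**3 - 6 * x + 1

def hess(x):    # f''(x)
    return 12 * x**2 - 6

M = 48.0        # on |x| <= 2, |f'''(x)| = 24|x| <= 48, so f'' is M-Lipschitz there

def mm_step(xk):
    # Surrogate: second-order Taylor model plus cubic regularization.
    # It matches f at xk and upper-bounds f on [-2, 2], so minimizing it
    # cannot increase the objective (the majorization-minimization step).
    def surrogate(y):
        d = y - xk
        return f(xk) + grad(xk) * d + 0.5 * hess(xk) * d**2 + (M / 6.0) * abs(d)**3
    return minimize_scalar(surrogate, bounds=(-2.0, 2.0), method="bounded").x

xk = 2.0
for _ in range(20):
    xk = mm_step(xk)
print(f"approximate stationary point: x = {xk:.6f}, f(x) = {f(xk):.6f}")

Each iteration minimizes an upper bound that is tight at the current iterate, so the objective values are monotonically nonincreasing; the cubic (third-order) regularization term is an example of the kind of higher-order surrogate the abstract refers to.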
