Paper Title
Douglas-Rachford splitting and ADMM for nonconvex optimization: Accelerated and Newton-type linesearch algorithms
Paper Authors
Paper Abstract
Although the performance of popular optimization algorithms such as Douglas-Rachford splitting (DRS) and ADMM is satisfactory on small and well-scaled problems, ill conditioning and problem size pose a severe obstacle to their reliable employment. Expanding on recent convergence results for DRS and ADMM applied to nonconvex problems, we propose two linesearch algorithms to enhance and robustify these methods by means of quasi-Newton directions. The proposed algorithms are suited for nonconvex problems, require the same black-box oracle as DRS and ADMM, and maintain their (subsequential) convergence properties. Numerical evidence shows that the employment of L-BFGS in the proposed framework greatly improves convergence of DRS and ADMM, making them robust to ill conditioning. Under regularity and nondegeneracy assumptions at the limit point, superlinear convergence is shown when quasi-Newton Broyden directions are adopted.
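For context on the black-box oracle the abstract refers to, the following is a minimal sketch of the plain DRS iteration that the proposed linesearch methods build on: each step calls the two proximal mappings and updates the governing sequence by the fixed-point residual. The code is not from the paper; the toy problem (a convex lasso instance, chosen for clarity even though the paper targets the nonconvex setting), the step size gamma, and the tolerance are illustrative assumptions, and the paper's quasi-Newton acceleration and linesearch are not implemented here.

```python
# Minimal DRS sketch for min_x f(x) + g(x) with
#   f(x) = 0.5 * ||A x - b||^2  and  g(x) = mu * ||x||_1  (toy lasso instance).
# Illustrative only; problem data and parameters are assumptions, not from the paper.
import numpy as np

def prox_least_squares(s, A, b, gamma):
    # prox_{gamma f}(s): solve (I + gamma A^T A) x = s + gamma A^T b
    n = A.shape[1]
    return np.linalg.solve(np.eye(n) + gamma * A.T @ A, s + gamma * A.T @ b)

def prox_l1(s, t):
    # prox of t * ||.||_1: componentwise soft thresholding
    return np.sign(s) * np.maximum(np.abs(s) - t, 0.0)

def drs(A, b, mu, gamma=1.0, tol=1e-8, max_iter=500):
    s = np.zeros(A.shape[1])
    for k in range(max_iter):
        x = prox_least_squares(s, A, b, gamma)   # x = prox_{gamma f}(s)
        y = prox_l1(2 * x - s, gamma * mu)       # y = prox_{gamma g}(2x - s)
        r = y - x                                # fixed-point residual (zero at a solution)
        s = s + r                                # nominal DRS update of the governing sequence
        if np.linalg.norm(r) <= tol:
            break
    return y, k

rng = np.random.default_rng(0)
A, b = rng.standard_normal((40, 100)), rng.standard_normal(40)
x_star, iters = drs(A, b, mu=0.1)
print(f"stopped after {iters} iterations, nnz(x) = {np.count_nonzero(x_star)}")
```

The paper's algorithms reuse exactly these two prox evaluations per iteration but replace the nominal update of s with a candidate quasi-Newton (e.g., L-BFGS or Broyden) direction, safeguarded by a linesearch so that the (subsequential) convergence guarantees of DRS and ADMM are retained.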