Paper Title
Manifold Proximal Point Algorithms for Dual Principal Component Pursuit and Orthogonal Dictionary Learning
Paper Authors
Paper Abstract
We consider the problem of maximizing the $\ell_1$ norm of a linear map over the sphere, which arises in various machine learning applications such as orthogonal dictionary learning (ODL) and robust subspace recovery (RSR). The problem is numerically challenging due to its nonsmooth objective and nonconvex constraint, and its algorithmic aspects have not been well explored. In this paper, we show how the manifold structure of the sphere can be exploited to design fast algorithms for tackling this problem. Specifically, our contribution is threefold. First, we present a manifold proximal point algorithm (ManPPA) for the problem and show that it converges at a sublinear rate. Furthermore, we show that ManPPA can achieve a quadratic convergence rate when applied to the ODL and RSR problems. Second, we propose a stochastic variant of ManPPA called StManPPA, which is well suited for large-scale computation, and establish its sublinear convergence rate. Both ManPPA and StManPPA have provably faster convergence rates than existing subgradient-type methods. Third, using ManPPA as a building block, we propose a new approach to solving a matrix analog of the problem, in which the sphere is replaced by the Stiefel manifold. The results from our extensive numerical experiments on the ODL and RSR problems demonstrate the efficiency and efficacy of our proposed methods.
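For concreteness, one way to write the sphere-constrained problem described above is the following sketch, using $A \in \mathbb{R}^{m \times n}$ for the linear map and $q$ for the decision variable (this notation is assumed here for illustration and is not taken from the abstract):
$$\max_{q \in \mathbb{R}^{n}} \; \|Aq\|_1 \quad \text{subject to} \quad \|q\|_2 = 1.$$
The matrix analog mentioned in the third contribution replaces the unit-sphere constraint with an orthogonality constraint $Q^\top Q = I_k$, i.e., the variable $Q$ is required to lie on the Stiefel manifold.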