Paper Title

Riemannian Stochastic Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold

Paper Authors

Bokun Wang, Shiqian Ma, Lingzhou Xue

Paper Abstract

Riemannian optimization has drawn a lot of attention due to its wide applications in practice. Riemannian stochastic first-order algorithms have been studied in the literature to solve large-scale machine learning problems over Riemannian manifolds. However, most existing Riemannian stochastic algorithms require the objective function to be differentiable, and they do not apply to the case where the objective function is nonsmooth. In this paper, we present two Riemannian stochastic proximal gradient methods for minimizing a nonsmooth function over the Stiefel manifold. The two methods, named R-ProxSGD and R-ProxSPB, are generalizations of proximal SGD and proximal SpiderBoost from the Euclidean setting to the Riemannian setting. An analysis of the incremental first-order oracle (IFO) complexity of the proposed algorithms is provided. Specifically, the R-ProxSPB algorithm finds an $\epsilon$-stationary point with $\mathcal{O}(\epsilon^{-3})$ IFOs in the online case, and with $\mathcal{O}(n+\sqrt{n}\epsilon^{-2})$ IFOs in the finite-sum case, where $n$ is the number of summands in the objective. Experimental results on online sparse PCA and robust low-rank matrix completion show that our proposed methods significantly outperform existing methods that use Riemannian subgradient information.
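The abstract does not spell out the updates, but for readers who want a concrete picture, the sketch below shows what a generic Riemannian stochastic proximal-gradient-style iteration for online sparse PCA (one of the paper's test problems) might look like. This is a minimal illustration under standard placeholder choices, not the R-ProxSGD or R-ProxSPB update analyzed in the paper: the QR retraction, tangent-space projection, soft-thresholding proximal step, and fixed step size are all assumptions made here for illustration.

```python
import numpy as np

def qr_retraction(X, xi):
    """Retract the step X + xi back onto the Stiefel manifold
    St(d, r) = {X : X^T X = I_r} via a QR decomposition (one standard choice)."""
    Q, R = np.linalg.qr(X + xi)
    # Fix column signs so the retraction is uniquely defined.
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def tangent_projection(X, G):
    """Project an ambient direction G onto the tangent space of St(d, r) at X."""
    XtG = X.T @ G
    return G - X @ (XtG + XtG.T) / 2

def soft_threshold(Z, tau):
    """Euclidean proximal operator of tau * ||.||_1 (sparsity-inducing term)."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def sparse_pca_step(X, A_batch, mu, eta):
    """One illustrative stochastic proximal-gradient-style step for online sparse PCA:
    minimize E[-x^T a a^T x terms] + mu * ||X||_1 over the Stiefel manifold.
    A_batch is a (batch, d) array of samples; eta is the step size.
    NOTE: a placeholder sketch, not the proximal subproblem solved in the paper."""
    # Stochastic Euclidean gradient of the smooth (negative variance) term.
    G = -(A_batch.T @ (A_batch @ X)) / A_batch.shape[0]
    # Euclidean proximal step on the nonsmooth term, taken as a proposed move...
    V = soft_threshold(X - eta * G, eta * mu) - X
    # ...then restricted to the tangent space and retracted back onto the manifold.
    xi = tangent_projection(X, V)
    return qr_retraction(X, xi)

# Toy usage: d = 20 features, r = 3 components, stream of mini-batches.
rng = np.random.default_rng(0)
X = np.linalg.qr(rng.standard_normal((20, 3)))[0]  # feasible starting point
for _ in range(100):
    A_batch = rng.standard_normal((32, 20))
    X = sparse_pca_step(X, A_batch, mu=0.1, eta=0.05)
print(np.linalg.norm(X.T @ X - np.eye(3)))  # iterate stays (numerically) on the manifold
```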
