Paper Title

A Scalable, Adaptive and Sound Nonconvex Regularizer for Low-rank Matrix Completion

Authors

Wang, Yaqing, Yao, Quanming, Kwok, James T.

Abstract

Matrix learning is at the core of many machine learning problems. A number of real-world applications, such as collaborative filtering and text mining, can be formulated as a low-rank matrix completion problem, which recovers an incomplete matrix under a low-rank assumption. To ensure that the matrix solution has a low rank, a recent trend is to use nonconvex regularizers that adaptively penalize singular values. They offer good recovery performance and have nice theoretical properties, but are computationally expensive due to repeated access to individual singular values. In this paper, based on the key insight that adaptive shrinkage of singular values improves empirical performance, we propose a new nonconvex low-rank regularizer, called the "nuclear norm minus Frobenius norm" regularizer, which is scalable, adaptive and sound. We first show that it provably has the adaptive shrinkage property. Further, we discover its factored form, which bypasses the computation of singular values and allows fast optimization by general-purpose optimization algorithms. Stable recovery and convergence are guaranteed. Extensive low-rank matrix completion experiments on a number of synthetic and real-world data sets show that the proposed method obtains state-of-the-art recovery performance while being the fastest among existing low-rank matrix learning methods.
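To make the regularizer in the abstract concrete, here is a minimal NumPy sketch (not the paper's implementation) that evaluates R(X) = ||X||_* − ||X||_F via an SVD. Note that the paper's factored form exists precisely to avoid this SVD at scale; this direct version is only for illustration. Because both norms are functions of the singular values, R(X) = 0 exactly when X has rank one, and R(X) > 0 otherwise, which is why minimizing it encourages low rank.

```python
import numpy as np

def nnfn(X):
    """Illustrative "nuclear norm minus Frobenius norm" regularizer.

    R(X) = ||X||_* - ||X||_F, i.e. (sum of singular values) minus
    (sqrt of the sum of squared singular values). For a rank-1 matrix
    the two norms coincide, so R(X) = 0; for higher rank, R(X) > 0.
    """
    s = np.linalg.svd(X, compute_uv=False)  # singular values of X
    return s.sum() - np.linalg.norm(s)      # ||X||_* - ||X||_F

# Rank-1 example: the regularizer vanishes.
X1 = np.array([[1.0], [2.0]]) @ np.array([[3.0, 4.0]])
print(nnfn(X1))  # ~0.0

# Full-rank example: identity has singular values (1, 1, 1),
# so R = 3 - sqrt(3) > 0.
X2 = np.eye(3)
print(nnfn(X2))
```

The SVD here costs O(mn·min(m,n)) per evaluation, which is the scalability bottleneck that the factored form described in the abstract sidesteps.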
