Title

Low-Rank and Sparse Enhanced Tucker Decomposition for Tensor Completion

Authors

Chenjian Pan, Chen Ling, Hongjin He, Liqun Qi, Yanwei Xu

Abstract

Tensor completion refers to the task of estimating missing data from incomplete measurements or observations, a core problem frequently arising in big data analysis, computer vision, and network engineering. Due to the multidimensional nature of high-order tensors, matrix-based approaches, e.g., matrix factorization and direct matricization of tensors, are often not ideal for tensor completion and recovery. In this paper, we introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion. Our model includes a sparse regularization term that promotes a sparse core tensor in the Tucker decomposition, which is beneficial for tensor data compression. Moreover, we enforce low-rank regularization terms on the factor matrices of the Tucker decomposition to induce low-rankness of the tensor at a low computational cost. Numerically, we propose a customized alternating direction method of multipliers (ADMM) with easy subproblems to solve the underlying model. Notably, our model can handle different types of real-world data sets, since it exploits the potential periodicity and inherent correlation properties appearing in tensors. A series of computational experiments on real-world data sets, including internet traffic data, color images, and face recognition, demonstrates that our model outperforms many existing state-of-the-art matricization and tensorization approaches in terms of recovery accuracy.
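To make the composite objective described above concrete, the following is a minimal NumPy sketch of a Tucker-style completion objective: data fidelity on the observed entries, an l1 penalty promoting a sparse core tensor, and nuclear-norm penalties inducing low-rank factor matrices. The function names and the regularization weights are illustrative assumptions; this evaluates the kind of objective the abstract describes, and is not the authors' actual model or their ADMM solver.

```python
import numpy as np

def mode_n_product(T, M, n):
    """Multiply tensor T by matrix M along mode n (mode-n product)."""
    Tn = np.moveaxis(T, n, 0).reshape(T.shape[n], -1)   # unfold along mode n
    out = M @ Tn
    new_shape = (M.shape[0],) + tuple(s for i, s in enumerate(T.shape) if i != n)
    return np.moveaxis(out.reshape(new_shape), 0, n)    # fold back

def tucker_reconstruct(core, factors):
    """Assemble the full tensor: core ×_1 U1 ×_2 U2 ... ×_N UN."""
    X = core
    for n, U in enumerate(factors):
        X = mode_n_product(X, U, n)
    return X

def objective(obs, mask, core, factors, lam_sparse, lam_rank):
    """Illustrative objective: masked fit + sparse core + low-rank factors."""
    X = tucker_reconstruct(core, factors)
    fit = 0.5 * np.sum(mask * (X - obs) ** 2)            # fidelity on observed entries
    sparse = lam_sparse * np.sum(np.abs(core))           # l1 promotes a sparse core
    low_rank = lam_rank * sum(np.linalg.norm(U, 'nuc')   # nuclear norm on each factor
                              for U in factors)
    return fit + sparse + low_rank
```

With identity factors and a fully observed tensor equal to its own core, the fidelity term vanishes, which gives a quick sanity check of the reconstruction. A real solver would minimize this objective over the core and factors, e.g., via ADMM as in the paper.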
