Paper Title
Tensor Completion via Tensor Train Based Low-Rank Quotient Geometry under a Preconditioned Metric
Paper Authors
Paper Abstract
This paper investigates the low-rank tensor completion problem, which concerns recovering a tensor from its partially observed entries. We consider this problem in the tensor train format and extend the preconditioned metric from the matrix case to the tensor case. The first-order and second-order quotient geometry of the manifold of fixed tensor train rank tensors under this metric is studied in detail. Algorithms, including Riemannian gradient descent, Riemannian conjugate gradient, and Riemannian Gauss-Newton, are proposed for the tensor completion problem based on the quotient geometry. We also show that the Riemannian Gauss-Newton method on the quotient geometry is equivalent to the Riemannian Gauss-Newton method on the embedded geometry with a specific retraction. Empirical evaluations on random instances as well as on function-related tensors show that the proposed algorithms are competitive with existing algorithms in terms of recovery ability, convergence performance, and reconstruction quality.
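To make the problem setting concrete, the following is a minimal sketch (not the authors' code) of the tensor completion setup in the tensor train (TT) format: it builds a low-TT-rank ground-truth tensor from random cores, observes a random subset of entries, and evaluates the least-squares completion objective. The dimensions, ranks, and function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

dims = (8, 9, 10, 7)      # tensor sizes n_1, ..., n_d (assumed for illustration)
ranks = (1, 3, 4, 3, 1)   # TT ranks r_0, ..., r_d with r_0 = r_d = 1

# Random TT cores G_k of shape (r_{k-1}, n_k, r_k).
cores = [rng.standard_normal((ranks[k], dims[k], ranks[k + 1]))
         for k in range(len(dims))]

def tt_to_full(cores):
    """Contract the TT cores into the full tensor."""
    full = cores[0]                                   # shape (1, n_1, r_1)
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))                 # shape (n_1, ..., n_d)

A = tt_to_full(cores)        # ground-truth tensor of fixed TT rank

# Observe a random subset Omega of entries with sampling ratio rho.
rho = 0.3
mask = rng.random(dims) < rho                         # indicator of Omega

def completion_objective(cores, A, mask):
    """f(X) = 0.5 * || P_Omega(X) - P_Omega(A) ||_F^2 for X given by TT cores."""
    X = tt_to_full(cores)
    residual = (X - A) * mask
    return 0.5 * np.sum(residual ** 2)

print(completion_objective(cores, A, mask))           # zero at the ground truth
```

The Riemannian methods described in the paper minimize this objective over the manifold of fixed TT rank tensors; the sketch above only sets up the data and the cost function, not the quotient-geometry algorithms themselves.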