Paper Title
Tensor Network States with Low-Rank Tensors
Paper Authors
Paper Abstract
Tensor networks are used to efficiently approximate states of strongly correlated quantum many-body systems. More generally, tensor network approximations may allow one to reduce the cost of operating on an order-$N$ tensor from exponential to polynomial in $N$, and this has become a popular approach for machine learning. We introduce the idea of imposing low-rank constraints on the tensors that compose the tensor network. With this modification, the time and space complexities of the network optimization can be substantially reduced while maintaining high accuracy. We detail this idea for tree tensor network states (TTNS) and projected entangled-pair states. Simulations of spin models on Cayley trees with low-rank TTNS exemplify the effect of rank constraints on the expressive power. We find that choosing the tensor rank $r$ to be on the order of the bond dimension $m$ is sufficient to obtain high-accuracy ground-state approximations and to substantially outperform standard TTNS computations. Thus low-rank tensor networks are a promising route for the simulation of quantum matter and for machine learning on large data sets.
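To illustrate the general idea behind the abstract (not the paper's specific construction), the following minimal NumPy sketch compares a dense order-3 TTNS tensor with a CP-style low-rank factorization of the same tensor. The names `m`, `r`, `A`, `B`, `C`, and the choice of a CP-type format are illustrative assumptions; the paper may impose its rank constraint differently.

```python
# Minimal sketch: storage and contraction savings from a low-rank
# constraint on a single order-3 TTNS tensor (illustrative only).
import numpy as np

m = 64   # bond dimension
r = 64   # tensor rank, chosen on the order of m as in the abstract

# Standard TTNS tensor: dense order-3 array with m**3 parameters.
T_full = np.random.randn(m, m, m)

# Low-rank constrained tensor: three factor matrices with 3*m*r
# parameters in total, representing
#   T[i, j, k] = sum_a A[i, a] * B[j, a] * C[k, a].
A = np.random.randn(m, r)
B = np.random.randn(m, r)
C = np.random.randn(m, r)

print("dense parameters   :", T_full.size)               # m**3
print("low-rank parameters:", A.size + B.size + C.size)  # 3*m*r

# Contracting a vector v onto the third leg:
v = np.random.randn(m)
# dense tensor: O(m**3) operations
w_full = np.einsum("ijk,k->ij", T_full, v)
# low-rank tensor: O(m*r) for C^T v, then O(m**2 * r) for the outer sum
w_lr = np.einsum("ia,ja,a->ij", A, B, C.T @ v)
```

With $r$ on the order of $m$, the parameter count drops from $m^3$ to roughly $3m^2$, which is the kind of reduction in time and space complexity the abstract refers to.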