Paper Title
Implicit Regularization with Polynomial Growth in Deep Tensor Factorization
Paper Authors
Paper Abstract
We study the implicit regularization effects of deep learning in tensor factorization. While implicit regularization in deep matrix and 'shallow' tensor factorization via linear and certain types of non-linear neural networks promotes low-rank solutions with at most quadratic growth, we show that its effect in deep tensor factorization grows polynomially with the depth of the network. This provides a remarkably faithful description of the observed experimental behaviour. Using numerical experiments, we demonstrate the benefits of this implicit regularization in yielding more accurate estimates and better convergence properties.