Paper Title

Entrywise convergence of iterative methods for eigenproblems

Paper Authors

Vasileios Charisopoulos, Austin R. Benson, Anil Damle

Paper Abstract

Several problems in machine learning, statistics, and other fields rely on computing eigenvectors. For large-scale problems, the computation of these eigenvectors is typically performed via iterative schemes such as subspace iteration or Krylov methods. While there is classical and comprehensive analysis for subspace convergence guarantees with respect to the spectral norm, in many modern applications other notions of subspace distance are more appropriate. Recent theoretical work has focused on perturbations of subspaces measured in the $\ell_{2 \to \infty}$ norm, but does not consider the actual computation of eigenvectors. Here we address the convergence of subspace iteration when distances are measured in the $\ell_{2 \to \infty}$ norm and provide deterministic bounds. We complement our analysis with a practical stopping criterion and demonstrate its applicability via numerical experiments. Our results show that one can get comparable performance on downstream tasks while requiring fewer iterations, thereby saving substantial computational time.
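To make the objects in the abstract concrete: for a matrix $M \in \mathbb{R}^{n \times k}$, the $\ell_{2 \to \infty}$ norm is the largest row-wise Euclidean norm, $\|M\|_{2 \to \infty} = \max_{1 \le i \le n} \|M_{i,:}\|_2$, so convergence in this norm controls every individual row (and hence entry) of the computed eigenvector basis. Below is a minimal NumPy sketch of subspace iteration that stops when successive iterates are close in this norm after an orthogonal alignment. The helper names (`two_to_inf`, `subspace_iteration`) and the Procrustes-aligned stopping test are illustrative assumptions for exposition, not the paper's actual stopping criterion.

```python
import numpy as np

def two_to_inf(M):
    # l_{2->inf} norm: the largest row-wise Euclidean norm of M.
    return float(np.max(np.linalg.norm(M, axis=1)))

def subspace_iteration(A, k, tol=1e-8, max_iter=500, rng=None):
    # Basic subspace iteration for a dominant invariant subspace of a
    # symmetric matrix A, with a stopping test in the l_{2->inf} norm.
    rng = np.random.default_rng() if rng is None else rng
    n = A.shape[0]
    V, _ = np.linalg.qr(rng.standard_normal((n, k)))  # random orthonormal start
    for _ in range(max_iter):
        W, _ = np.linalg.qr(A @ V)  # power step + re-orthonormalization
        # Align W with V via the orthogonal Procrustes solution, so the
        # distance is invariant to the basis chosen for each subspace.
        U, _, Vt = np.linalg.svd(W.T @ V)
        if two_to_inf(W @ (U @ Vt) - V) < tol:
            return W
        V = W
    return V
```

A quick sanity check on a small synthetic symmetric matrix:

```python
rng = np.random.default_rng(0)
B = rng.standard_normal((300, 300))
A = (B + B.T) / 2  # random symmetric test matrix
V = subspace_iteration(A, k=5, tol=1e-6, rng=rng)
# The columns of V approximately span the invariant subspace of A
# associated with its k eigenvalues of largest magnitude.
```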
