Paper title
Convergence bounds for local least squares approximation
Paper authors
Paper abstract
We consider the problem of approximating a function in a general nonlinear subset of $L^2$, when only a weighted Monte Carlo estimate of the $L^2$-norm can be computed. Of particular interest in this setting is the concept of sample complexity, the number of sample points that are necessary to achieve a prescribed error with high probability. Reasonable worst-case bounds for this quantity exist only for particular model classes, like linear spaces or sets of sparse vectors. For more general sets, like tensor networks or neural networks, the currently existing bounds are very pessimistic. By restricting the model class to a neighbourhood of the best approximation, we can derive improved worst-case bounds for the sample complexity. When the considered neighbourhood is a manifold with positive local reach, its sample complexity can be estimated by means of the sample complexities of the tangent and normal spaces and the manifold's curvature.
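One standard way to formalize the setting described above is sketched below; the notation (e.g. $\rho$, $w$, $\mathcal{M}$, $u_n$, $\tau$, $\delta$) is illustrative and not fixed by the abstract. Given a target $u \in L^2(\rho)$, a model class $\mathcal{M} \subseteq L^2(\rho)$, i.i.d. samples $x_1,\dots,x_n \sim \rho$ and a positive weight function $w$, the weighted Monte Carlo estimate of the $L^2$-norm and the resulting least squares estimator read
$$\|v\|_n^2 := \frac{1}{n}\sum_{i=1}^{n} w(x_i)\,\lvert v(x_i)\rvert^2, \qquad u_n \in \operatorname*{arg\,min}_{v\in\mathcal{M}} \|u - v\|_n^2 .$$
The sample complexity can then be understood as the smallest $n$ for which, with probability at least $1-\delta$, a quasi-optimality bound of the form $\|u - u_n\|_{L^2} \le (1+\tau)\,\inf_{v\in\mathcal{M}} \|u - v\|_{L^2}$ holds for a prescribed tolerance $\tau > 0$ and failure probability $\delta \in (0,1)$.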