Title

Rates of Convergence for Regression with the Graph Poly-Laplacian

Authors

Nicolás García Trillos, Ryan Murray, Matthew Thorpe

Abstract

In the (special) smoothing spline problem one considers a variational problem with a quadratic data fidelity penalty and Laplacian regularisation. Higher-order regularity can be obtained by replacing the Laplacian regulariser with a poly-Laplacian regulariser. The methodology is readily adapted to graphs, and here we consider graph poly-Laplacian regularisation in a fully supervised, non-parametric, noise-corrupted regression problem. In particular, given a dataset $\{x_i\}_{i=1}^n$ and a set of noisy labels $\{y_i\}_{i=1}^n\subset\mathbb{R}$, we let $u_n:\{x_i\}_{i=1}^n\to\mathbb{R}$ be the minimiser of an energy which consists of a data fidelity term and an appropriately scaled graph poly-Laplacian term. When $y_i = g(x_i)+\xi_i$, for iid noise $\xi_i$, and using the geometric random graph, we identify (with high probability) the rate of convergence of $u_n$ to $g$ in the large data limit $n\to\infty$. Furthermore, our rate, up to logarithms, coincides with the known rate of convergence in the usual smoothing spline model.
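The estimator described in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' implementation: it builds an $\varepsilon$-neighbourhood geometric graph on one-dimensional sample points, forms the unnormalised graph Laplacian $L$, and minimises the quadratic energy $\frac{1}{n}\|u-y\|^2 + \tau\,\langle u, L^s u\rangle$, whose minimiser solves the linear system $(I + n\tau L^s)\,u = y$. The graph weights, the order $s$, and all parameter values are illustrative assumptions.

```python
import numpy as np

def geometric_graph_laplacian(x, eps):
    """Unnormalised Laplacian of the eps-neighbourhood graph on 1-d points x."""
    d = np.abs(x[:, None] - x[None, :])   # pairwise distances
    w = (d <= eps).astype(float)          # unit weights within radius eps
    np.fill_diagonal(w, 0.0)              # no self-loops
    return np.diag(w.sum(axis=1)) - w     # L = D - W

rng = np.random.default_rng(0)
n, eps, tau, s = 200, 0.1, 1e-4, 2        # illustrative parameter choices
x = np.sort(rng.uniform(0.0, 1.0, n))     # sample points x_i
g = np.sin(2 * np.pi * x)                 # ground-truth regression function
y = g + 0.3 * rng.normal(size=n)          # noisy labels y_i = g(x_i) + xi_i

L = geometric_graph_laplacian(x, eps)
# The minimiser of (1/n)||u - y||^2 + tau * <u, L^s u> solves
# (I + n * tau * L^s) u = y.
u = np.linalg.solve(np.eye(n) + n * tau * np.linalg.matrix_power(L, s), y)

rmse_u = np.sqrt(np.mean((u - g) ** 2))   # error of the regularised estimate
rmse_y = np.sqrt(np.mean((y - g) ** 2))   # error of the raw noisy labels
```

For a well-chosen scaling of $\tau$ the denoised estimate $u_n$ should be closer to $g$ than the raw labels are; the theoretical results concern exactly how fast this error shrinks as $n\to\infty$.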
