Paper Title

Approximation Rates for Neural Networks with Encodable Weights in Smoothness Spaces

Authors

Ingo Gühring, Mones Raslan

Abstract

We examine the necessary and sufficient complexity of neural networks to approximate functions from different smoothness spaces under the restriction of encodable network weights. Based on an entropy argument, we start by proving lower bounds for the number of nonzero encodable weights for neural network approximation in Besov spaces, Sobolev spaces and more. These results are valid for all sufficiently smooth activation functions. Afterwards, we provide a unifying framework for the construction of approximate partitions of unity by neural networks with fairly general activation functions. This allows us to approximate localized Taylor polynomials by neural networks and make use of the Bramble-Hilbert Lemma. Based on our framework, we derive almost optimal upper bounds in higher-order Sobolev norms. This work advances the theory of approximating solutions of partial differential equations by neural networks.
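As a rough numerical illustration of the approximate-partition-of-unity idea mentioned in the abstract (a minimal sketch under stated assumptions, not the paper's construction: it assumes the logistic sigmoid as activation, and the grid `knots` and slope `a` are hypothetical parameters chosen for the demo), differences of shifted sigmoid neurons telescope to a function that stays close to 1 on the interval of interest:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Each bump is a difference of two shifted sigmoid neurons:
#   phi_i(x) = sigmoid(a*(x - t_i)) - sigmoid(a*(x - t_{i+1}))
# Their sum telescopes to sigmoid(a*(x - t_0)) - sigmoid(a*(x - t_N)),
# which is exponentially close to 1 on (t_0, t_N) for a large slope a.

a = 200.0                              # slope of each sigmoid (illustrative choice)
knots = np.linspace(-0.1, 1.1, 13)     # t_0 < ... < t_N, padded slightly beyond [0, 1]
x = np.linspace(0.0, 1.0, 1001)

bumps = np.stack([
    sigmoid(a * (x - t0)) - sigmoid(a * (x - t1))
    for t0, t1 in zip(knots[:-1], knots[1:])
])

total = bumps.sum(axis=0)              # should be close to 1 everywhere on [0, 1]
print("max deviation from 1:", np.abs(total - 1.0).max())
```

Each bump here is realized by a two-neuron shallow network; multiplying such bumps against local Taylor polynomials is the standard way localized approximations of the kind described above are assembled, though the paper's own framework and activation assumptions are more general.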
