Paper Title
Error estimate for a universal function approximator of ReLU network with a local connection
Paper Authors
Paper Abstract
Neural networks have shown highly successful performance across a wide range of tasks, but further study is needed to improve their performance. We analyze the approximation error of a specific neural network architecture with local connections, which is more broadly applicable than the fully connected one because locally connected networks can be used to describe diverse architectures such as CNNs. Our error estimate depends on two parameters: one controlling the depth of the hidden layers, and the other controlling their width.
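The central object in the abstract is a locally connected ReLU network whose depth and width are controlled by two parameters. As a rough illustration only, and not the paper's construction, the following NumPy sketch shows one way such a layer can differ from a fully connected layer: each hidden unit applies its own (untied) weights to a small window of the previous layer. The function names, window placement, and parameter values below are assumptions made for this sketch.

```python
import numpy as np

def relu(x):
    # elementwise ReLU activation
    return np.maximum(x, 0.0)

def locally_connected_layer(x, weights, biases, window):
    """One hidden layer in which each unit sees only a local window of the input.

    x       : input vector, shape (n,)
    weights : untied weights, shape (m, window) -- one row per output unit
    biases  : shape (m,)
    window  : size of each unit's local receptive field (an illustrative choice)
    """
    n, m = x.shape[0], weights.shape[0]
    # place each output unit's window at evenly spaced positions along the input
    starts = np.linspace(0, n - window, m).astype(int)
    patches = np.stack([x[s:s + window] for s in starts])   # shape (m, window)
    return relu(np.sum(weights * patches, axis=1) + biases)

# Toy network: "depth" and "width" stand in for the two parameters the abstract
# refers to; the specific values here are arbitrary.
rng = np.random.default_rng(0)
depth, width, window = 3, 16, 5
h = rng.standard_normal(32)                    # input signal
for _ in range(depth):
    W = rng.standard_normal((width, window)) * 0.1
    b = np.zeros(width)
    h = locally_connected_layer(h, W, b, window)
print(h.shape)   # (16,)
```

In contrast, a fully connected layer would use a weight matrix of shape `(width, n)` acting on the entire previous layer; the local version restricts each unit to a window, which is the structural feature shared with CNN-type architectures mentioned in the abstract.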