Paper Title

Memory capacity of neural networks with threshold and ReLU activations

Authors

Vershynin, Roman

Abstract

Overwhelming theoretical and empirical evidence shows that mildly overparametrized neural networks -- those with more connections than the size of the training data -- are often able to memorize the training data with $100\%$ accuracy. This was rigorously proved for networks with sigmoid activation functions and, very recently, for ReLU activations. Addressing a 1988 open question of Baum, we prove that this phenomenon holds for general multilayered perceptrons, i.e. neural networks with threshold activation functions, or with any mix of threshold and ReLU activations. Our construction is probabilistic and exploits sparsity.
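
To make the setting concrete, the following is a minimal sketch in Python/NumPy of a two-layer perceptron that mixes the threshold and ReLU activations discussed in the abstract. The layer sizes, random weights, and data are hypothetical and for illustration only; the paper's result is about the existence of weights that memorize the data (via a probabilistic construction), not about a training procedure.

```python
import numpy as np

def threshold(z):
    # Threshold (Heaviside) activation: 1 if the pre-activation is positive, else 0.
    return (z > 0).astype(float)

def relu(z):
    # ReLU activation: max(z, 0), applied elementwise.
    return np.maximum(z, 0.0)

# Hypothetical sizes, for illustration only: n training points in dimension d,
# with a hidden layer wide enough that the network has more connections than n.
n, d, hidden = 50, 10, 100
rng = np.random.default_rng(0)
X = rng.standard_normal((n, d))     # training inputs
y = rng.integers(0, 2, size=n)      # arbitrary binary labels to be memorized

# Random weights stand in here; the paper constructs suitable weights probabilistically.
W1, b1 = rng.standard_normal((d, hidden)), rng.standard_normal(hidden)
W2, b2 = rng.standard_normal((hidden, 1)), rng.standard_normal(1)

# Forward pass mixing the two activations: half the hidden units use the
# threshold activation, half use ReLU, and the output unit is a threshold unit.
pre = X @ W1 + b1
hidden_out = np.concatenate(
    [threshold(pre[:, : hidden // 2]), relu(pre[:, hidden // 2 :])], axis=1
)
predictions = threshold(hidden_out @ W2 + b2).ravel()

# With random weights the labels are of course not memorized; the theorem says
# that weights achieving 100% accuracy exist once the network is mildly
# overparametrized.
print("accuracy with random weights:", np.mean(predictions == y))
```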
