Paper Title

Asymptotic Stability in Reservoir Computing

Paper Authors

Jonathan Dong, Erik Börve, Mushegh Rafayelyan, Michael Unser

Paper Abstract

Reservoir Computing is a class of Recurrent Neural Networks with internal weights fixed at random. Stability relates to the sensitivity of the network state to perturbations. It is an important property in Reservoir Computing, as it directly impacts performance. In practice, it is desirable to stay in a stable regime, where the effect of perturbations does not explode exponentially, but also close to the chaotic frontier, where reservoir dynamics are rich. Open questions remain regarding input regularization and discontinuous activation functions. In this work, we use the recurrent kernel limit to draw new insights into stability in Reservoir Computing. This limit corresponds to large reservoir sizes, and it already becomes relevant for reservoirs with a few hundred neurons. We obtain a quantitative characterization of the frontier between stability and chaos, which can greatly benefit hyperparameter tuning. In a broader sense, our results contribute to the understanding of the complex dynamics of Recurrent Neural Networks.
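To make the stability notion above concrete, here is a minimal sketch, not taken from the paper, of a standard echo-state-style reservoir update x_{t+1} = tanh(rho * W x_t + W_in u_t) with random fixed weights. It probes stability empirically by tracking the distance between two trajectories started from slightly perturbed initial states: a decaying distance indicates the stable regime, exponential growth the chaotic one. The reservoir size, scaling factor `rho`, and input sequence are illustrative assumptions.

```python
# Minimal sketch of an empirical stability probe for a random reservoir.
# Assumes the standard echo-state update x_{t+1} = tanh(rho * W x_t + W_in u_t);
# all hyperparameter values below are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 300   # "a few hundred neurons", where the kernel limit becomes relevant
rho = 0.9         # reservoir weight scaling (the hyperparameter one would tune)

# Random, fixed internal and input weights.
W = rng.normal(0.0, 1.0 / np.sqrt(n_neurons), (n_neurons, n_neurons))
W_in = rng.normal(0.0, 1.0, (n_neurons, 1))

def step(x, u):
    """One reservoir update with tanh activation."""
    return np.tanh(rho * W @ x + W_in @ u)

# Drive two trajectories with the same input from slightly different initial
# states and track their distance over time.
T = 200
u_seq = rng.normal(size=(T, 1))
x_a = np.zeros(n_neurons)
x_b = x_a + 1e-6 * rng.normal(size=n_neurons)  # small perturbation

for t in range(T):
    x_a = step(x_a, u_seq[t])
    x_b = step(x_b, u_seq[t])

# Near zero in the stable regime; grows exponentially in the chaotic one.
print("final distance:", np.linalg.norm(x_a - x_b))
```

Sweeping `rho` in such a sketch locates the stability-chaos frontier empirically; the paper's recurrent-kernel analysis characterizes this frontier quantitatively in the large-reservoir limit.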
