Paper Title

Large Associative Memory Problem in Neurobiology and Machine Learning

Authors

Dmitry Krotov, John Hopfield

Abstract

Dense Associative Memories or modern Hopfield networks permit storage and reliable retrieval of an exponentially large (in the dimension of feature space) number of memories. At the same time, their naive implementation is non-biological, since it seemingly requires the existence of many-body synaptic junctions between the neurons. We show that these models are effective descriptions of a more microscopic (written in terms of biological degrees of freedom) theory that has additional (hidden) neurons and only requires two-body interactions between them. For this reason our proposed microscopic theory is a valid model of large associative memory with a degree of biological plausibility. The dynamics of our network and its reduced dimensional equivalent both minimize energy (Lyapunov) functions. When certain dynamical variables (hidden neurons) are integrated out from our microscopic theory, one can recover many of the models that were previously discussed in the literature, e.g. the model presented in the "Hopfield Networks is All You Need" paper. We also provide an alternative derivation of the energy function and the update rule proposed in the aforementioned paper and clarify the relationships between various models of this class.
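As a rough illustration of the update rule the abstract refers to, the sketch below implements the attention-like retrieval step g <- xi^T softmax(beta * xi g) from the "Hopfield Networks is All You Need" line of work, which, per the abstract, emerges from the microscopic theory once the hidden neurons are integrated out. This is a minimal NumPy sketch under those assumptions, not the paper's code; the names retrieve and softmax, the inverse-temperature parameter beta, and the toy data are illustrative.

    import numpy as np

    def softmax(x):
        # Numerically stable softmax over a 1-D array.
        e = np.exp(x - x.max())
        return e / e.sum()

    def retrieve(xi, g, beta=2.0, steps=5):
        # xi: (K, N) array holding K stored memories of dimension N.
        # g:  (N,) query state of the visible neurons.
        # Each step applies g <- xi^T softmax(beta * xi g), the
        # attention-like update discussed in the abstract (illustrative
        # sketch; beta acts as an inverse temperature that sharpens
        # retrieval as it grows).
        for _ in range(steps):
            g = xi.T @ softmax(beta * (xi @ g))
        return g

    # Toy usage: recover a stored pattern from a corrupted query.
    rng = np.random.default_rng(0)
    xi = rng.choice([-1.0, 1.0], size=(4, 16))   # 4 memories, 16 features
    query = xi[2] + 0.5 * rng.normal(size=16)    # noisy copy of memory 2
    print(np.sign(retrieve(xi, query)))          # approximately xi[2]

Because the dynamics minimize an energy (Lyapunov) function, as the abstract states, repeated applications of this update settle into a fixed point (a stored memory) rather than cycling.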
