Paper Title


SPICEprop: Backpropagating Errors Through Memristive Spiking Neural Networks

Paper Authors

Peng Zhou, Jason K. Eshraghian, Dong-Uk Choi, Sung-Mo Kang

Paper Abstract


We present a fully memristive spiking neural network (MSNN) consisting of novel memristive neurons trained using the backpropagation through time (BPTT) learning rule. Gradient descent is applied directly to the memristive integrate-and-fire (MIF) neuron designed using analog SPICE circuit models, which generates distinct depolarization, hyperpolarization, and repolarization voltage waveforms. Synaptic weights are trained by BPTT using the membrane potential of the MIF neuron model and can be processed on memristive crossbars. The natural spiking dynamics of the MIF neuron model are fully differentiable, eliminating the need for gradient approximations that are prevalent in the spiking neural network literature. Despite the added complexity of training directly on SPICE circuit models, we achieve 97.58% accuracy on the MNIST testing dataset and 75.26% on the Fashion-MNIST testing dataset, the highest accuracies among all fully MSNNs.
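The key point of the abstract is that the neuron dynamics are themselves differentiable, so BPTT can unroll the membrane state over time without the surrogate gradients common in SNN training. The following minimal PyTorch sketch illustrates that training pattern only; the leaky-integrator update, the smooth sigmoid spike readout, and the hyperparameters (beta, threshold, slope) are illustrative assumptions and do not reproduce the paper's MIF SPICE circuit equations.

import torch
import torch.nn as nn

class DifferentiableSpikingLayer(nn.Module):
    # Hypothetical stand-in for a differentiable spiking neuron layer (NOT the MIF circuit).
    def __init__(self, in_features, out_features, beta=0.9, threshold=1.0, slope=10.0):
        super().__init__()
        # Synaptic weights; in the paper these would map onto a memristive crossbar.
        self.fc = nn.Linear(in_features, out_features)
        self.beta = beta            # membrane leak factor (assumed)
        self.threshold = threshold  # firing threshold (assumed)
        self.slope = slope          # steepness of the smooth spike function (assumed)

    def forward(self, x_seq):
        # x_seq: (T, batch, in_features) input spike/current sequence.
        T, batch, _ = x_seq.shape
        mem = torch.zeros(batch, self.fc.out_features, device=x_seq.device)
        out = []
        for t in range(T):
            # Leaky integration of synaptic current into the membrane potential.
            mem = self.beta * mem + self.fc(x_seq[t])
            # Smooth, fully differentiable spike readout (no surrogate gradient needed).
            spk = torch.sigmoid(self.slope * (mem - self.threshold))
            # Soft reset: subtract the graded spike from the membrane potential.
            mem = mem - spk * self.threshold
            out.append(spk)
        return torch.stack(out)    # (T, batch, out_features)

if __name__ == "__main__":
    # Toy usage: unroll over T steps and backpropagate through time.
    T, batch, n_in, n_out = 25, 32, 784, 10
    layer = DifferentiableSpikingLayer(n_in, n_out)
    x = (torch.rand(T, batch, n_in) < 0.2).float()    # random Poisson-like input spikes
    target = torch.randint(0, n_out, (batch,))
    logits = layer(x).sum(dim=0)                      # spike counts used as class scores
    loss = nn.functional.cross_entropy(logits, target)
    loss.backward()                                   # BPTT: gradients flow through all T steps
    print(loss.item())

Because the spike readout is a smooth function of the membrane potential, gradients propagate through every time step of the loop above, which is the same reason the paper's MIF dynamics can be trained end to end without gradient approximations.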
