Paper Title

sBSNN: Stochastic-Bits Enabled Binary Spiking Neural Network with On-Chip Learning for Energy Efficient Neuromorphic Computing at the Edge

Paper Authors

Minsuk Koo, Gopalakrishnan Srinivasan, Yong Shim, Kaushik Roy

Paper Abstract

In this work, we propose a stochastic Binary Spiking Neural Network (sBSNN) composed of stochastic spiking neurons and binary synapses (stochastic only during training) that computes probabilistically with one-bit precision for power-efficient and memory-compressed neuromorphic computing. We present an energy-efficient implementation of the proposed sBSNN using the 'stochastic bit' as the core computational primitive to realize the stochastic neurons and synapses, fabricated in a 90nm CMOS process, to achieve efficient on-chip training and inference for image recognition tasks. The measured data show that the 'stochastic bit' can be programmed to mimic spiking neurons and to implement the stochastic Spike Timing Dependent Plasticity (sSTDP) rule for training the binary synaptic weights without expensive random number generators. Our results indicate that the proposed sBSNN realization offers up to 32x neuronal and synaptic memory compression compared to a full-precision (32-bit) SNN, and an energy efficiency of 89.49 TOPS/Watt for a two-layer fully-connected SNN.
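To make the abstract's core ideas concrete, below is a minimal NumPy sketch of a stochastic spiking neuron and a probabilistic binary-weight (sSTDP-style) update. The sigmoidal spike-probability curve, the potentiation/depression trigger conditions, and the probabilities `p_pot` and `p_dep` are all illustrative assumptions, not the paper's specification; the paper realizes these behaviors in custom 90nm CMOS 'stochastic bit' circuits rather than in software.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_spike(membrane_potential, slope=1.0):
    """Emit binary spikes with a sigmoidal probability of the membrane
    potential. The sigmoid is an assumed transfer curve; the fabricated
    'stochastic bit' realizes its own device-specific probability curve."""
    p = 1.0 / (1.0 + np.exp(-slope * np.asarray(membrane_potential, float)))
    return (rng.random(p.shape) < p).astype(np.int8)

def sstdp_update(w, pre, post, p_pot=0.1, p_dep=0.05):
    """Probabilistic binary-weight update in the spirit of sSTDP:
    flip 0 -> 1 with probability p_pot on a pre/post spike coincidence
    (potentiation), and 1 -> 0 with probability p_dep when the post
    neuron spikes without a pre spike (depression). The trigger
    conditions and probabilities here are illustrative."""
    potentiate = (pre == 1) & (post == 1) & (rng.random(w.shape) < p_pot)
    depress    = (pre == 0) & (post == 1) & (rng.random(w.shape) < p_dep)
    return np.where(potentiate, 1, np.where(depress, 0, w))

# Toy usage: 4 binary synapses feeding one stochastic output neuron.
w = rng.integers(0, 2, size=4)            # binary synaptic weights
pre = rng.integers(0, 2, size=4)          # presynaptic spikes this step
v = float(np.dot(w, pre)) - 1.0           # crude membrane potential
post = stochastic_spike([v])[0]           # single stochastic output spike
w = sstdp_update(w, pre, np.full(4, post))
print(w)
```

Because both the spikes and the weight flips are Bernoulli draws, no separate high-quality random number generator is needed in hardware, which is the efficiency argument the abstract makes for the 'stochastic bit' primitive.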
