Paper Title
Neuromorphic Data Augmentation for Training Spiking Neural Networks
Paper Authors
Abstract
Developing neuromorphic intelligence on event-based datasets with Spiking Neural Networks (SNNs) has recently attracted much research attention. However, the limited size of event-based datasets makes SNNs prone to overfitting and unstable convergence. This issue remains unexplored by previous academic works. To minimize this generalization gap, we propose Neuromorphic Data Augmentation (NDA), a family of geometric augmentations specifically designed for event-based datasets, with the goal of significantly stabilizing SNN training and reducing the gap between training and test performance. The proposed method is simple and compatible with existing SNN training pipelines. Using the proposed augmentation, we demonstrate, for the first time, the feasibility of unsupervised contrastive learning for SNNs. We conduct comprehensive experiments on prevailing neuromorphic vision benchmarks and show that NDA yields substantial improvements over previous state-of-the-art results. For example, NDA-based SNNs achieve accuracy gains of 10.1% and 13.7% on CIFAR10-DVS and N-Caltech 101, respectively. Code is available at https://github.com/Intelligent-Computing-Lab-Yale/NDA_SNN.
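To make the core idea concrete, the sketch below shows how a geometric augmentation can be applied to an event stream binned into frames: because transforms such as flips and translations act only on pixel coordinates, applying the *same* transform to every time bin preserves the temporal structure of the events. This is a minimal illustration only; the actual augmentation family, parameters, and implementation used by NDA are defined in the authors' repository, and the function name and `max_shift` parameter here are illustrative assumptions.

```python
import numpy as np

def augment_event_frames(frames, rng, max_shift=4):
    """Apply one random geometric augmentation consistently across all
    time bins of an event tensor of shape (T, H, W).

    Illustrative sketch of geometric augmentation for event data;
    not the exact NDA transform set. `np.roll` wraps pixels around
    the border, whereas a real pipeline would typically zero-pad.
    """
    frames = frames.copy()
    # Random horizontal flip, shared across all T time bins.
    if rng.random() < 0.5:
        frames = frames[:, :, ::-1]
    # Random integer translation, also shared across time bins.
    dy = int(rng.integers(-max_shift, max_shift + 1))
    dx = int(rng.integers(-max_shift, max_shift + 1))
    frames = np.roll(frames, (dy, dx), axis=(1, 2))
    return frames

# Example: augment a dummy 10-bin, 48x48 event tensor.
rng = np.random.default_rng(0)
x = rng.random((10, 48, 48))
y = augment_event_frames(x, rng)
```

Since both flipping and wrapping translation only permute pixel locations, the total event count per tensor is unchanged, which keeps the input statistics seen by the SNN consistent between augmented and original samples.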