Paper Title


Developmental Plasticity-inspired Adaptive Pruning for Deep Spiking and Artificial Neural Networks

Authors

Bing Han, Feifei Zhao, Yi Zeng, Guobin Shen

Abstract


Developmental plasticity plays a prominent role in shaping the brain's structure during ongoing learning in response to dynamically changing environments. However, existing network compression methods for deep artificial neural networks (ANNs) and spiking neural networks (SNNs) draw little inspiration from the brain's developmental plasticity mechanisms, limiting their ability to learn efficiently, rapidly, and accurately. This paper proposes a developmental plasticity-inspired adaptive pruning (DPAP) method, inspired by the adaptive developmental pruning of dendritic spines, synapses, and neurons according to the "use it or lose it, gradually decay" principle. The proposed DPAP model incorporates multiple biologically realistic mechanisms (such as dendritic spine dynamic plasticity, activity-dependent neural spiking traces, and local synaptic plasticity), together with an additional adaptive pruning strategy, so that the network structure can be dynamically optimized during learning without any pre-training or retraining. Extensive comparative experiments show consistent and remarkable performance and speed gains with extremely compressed networks on a diverse set of benchmark tasks for deep ANNs and SNNs, especially the spatio-temporal joint pruning of SNNs on neuromorphic datasets. This work explores how developmental plasticity enables complex deep networks to gradually evolve into brain-like efficient and compact structures, ultimately achieving state-of-the-art (SOTA) performance for biologically realistic SNNs.
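The "use it or lose it, gradually decay" principle described in the abstract can be illustrated with a minimal sketch: each synapse keeps an activity trace that is refreshed when its pre- and post-synaptic units are co-active and decays otherwise, and synapses whose trace falls below a threshold are pruned during learning. This is a hypothetical toy illustration of the idea, not the authors' DPAP implementation; all names, constants, and the binary activity proxy are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dense layer: 8 inputs -> 4 outputs (illustrative only).
W = rng.normal(size=(4, 8))
mask = np.ones_like(W)      # 1 = synapse alive, 0 = pruned
trace = np.zeros_like(W)    # activity-dependent synaptic trace

DECAY = 0.9                 # "gradually decay": trace leaks each step
PRUNE_THRESHOLD = 0.05      # synapses whose trace stays low are lost
WARMUP_STEPS = 50           # let traces build up before pruning starts

for step in range(200):
    x = (rng.random(8) < 0.3).astype(float)  # binary spike-like input
    y = (mask * W) @ x                       # forward pass on live synapses
    post = (y > 0).astype(float)             # crude post-synaptic activity proxy
    # "use it": co-active pre/post pairs refresh their trace
    trace = DECAY * trace + np.outer(post, x)
    # "lose it": permanently prune synapses whose trace decayed below threshold
    if step > WARMUP_STEPS:
        mask[trace < PRUNE_THRESHOLD] = 0.0

print("remaining synapses:", int(mask.sum()), "of", mask.size)
```

Because the mask is only ever set to zero, pruning is irreversible here, and pruned synapses are excluded from subsequent forward passes, so the structure is compressed while learning proceeds, with no separate pre-training or retraining phase.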
