Paper Title


A Temporal Neural Network Architecture for Online Learning

Paper Author

Smith, James E.

Paper Abstract


A long-standing proposition is that by emulating the operation of the brain's neocortex, a spiking neural network (SNN) can achieve similar desirable features: flexible learning, speed, and efficiency. Temporal neural networks (TNNs) are SNNs that communicate and process information encoded as relative spike times (in contrast to spike rates). A TNN architecture is proposed, and, as a proof-of-concept, TNN operation is demonstrated within the larger context of online supervised classification. First, through unsupervised learning, a TNN partitions input patterns into clusters based on similarity. The TNN then passes a cluster identifier to a simple online supervised decoder which finishes the classification task. The TNN learning process adjusts synaptic weights by using only signals local to each synapse, and clustering behavior emerges globally. The system architecture is described at an abstraction level analogous to the gate and register transfer levels in conventional digital design. Besides features of the overall architecture, several TNN components are new to this work. Although not addressed directly, the overall research objective is a direct hardware implementation of TNNs. Consequently, all the architecture elements are simple, and processing is done at very low precision.
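The abstract's key ideas can be illustrated with a toy sketch: inputs are encoded as relative spike times (earlier = stronger), a small winner-take-all column clusters patterns, and learning uses only signals local to each synapse. This is a hypothetical simplification for intuition only, assuming a linear latency code and a crude response-time model; the names `encode`, `response_times`, and `train_step` and the specific update rule are not from the paper.

```python
import numpy as np

T_MAX = 8  # spike times quantized to a small range (low-precision processing)

def encode(pattern):
    """Intensity-to-latency temporal code: stronger inputs spike earlier,
    so information is carried by relative spike times, not spike rates."""
    return (1.0 - np.asarray(pattern, dtype=float)) * T_MAX

def response_times(weights, spike_times):
    """Crude stand-in for integrate-and-fire response: a neuron whose
    strong synapses receive early spikes responds earlier (smaller value)."""
    return spike_times @ weights

def train_step(weights, pattern, lr=0.3):
    """One unsupervised winner-take-all step. The update is purely local:
    each synapse of the winning neuron moves toward 1 if its input spiked
    early, toward 0 otherwise; no global error signal is used."""
    t = encode(pattern)
    winner = int(np.argmin(response_times(weights, t)))
    early = (t < T_MAX / 2).astype(float)
    weights[:, winner] += lr * (early - weights[:, winner])
    return winner

# Two dissimilar binary patterns; the column learns to give each its own
# cluster identifier (the index of the earliest-responding neuron).
a, b = [1, 1, 0, 0], [0, 0, 1, 1]
W = np.full((4, 2), 0.5)  # 4 synapses per neuron, 2 neurons
for _ in range(15):
    train_step(W, a)
    train_step(W, b)
cluster_a = int(np.argmin(response_times(W, encode(a))))
cluster_b = int(np.argmin(response_times(W, encode(b))))
```

In the full architecture described by the abstract, the cluster identifier would then be passed to a simple online supervised decoder that completes the classification; here the winner index merely stands in for that identifier.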
