Paper Title
CSNNs: Unsupervised, Backpropagation-free Convolutional Neural Networks for Representation Learning
Paper Authors
Paper Abstract
This work combines Convolutional Neural Networks (CNNs), clustering via Self-Organizing Maps (SOMs) and Hebbian Learning to propose the building blocks of Convolutional Self-Organizing Neural Networks (CSNNs), which learn representations in an unsupervised and Backpropagation-free manner. Our approach replaces the learning of traditional convolutional layers from CNNs with the competitive learning procedure of SOMs and simultaneously learns local masks between those layers with separate Hebbian-like learning rules to overcome the problem of disentangling factors of variation when filters are learned through clustering. We investigate the learned representation by designing two simple models with our building blocks, which achieve performance comparable to many methods that use Backpropagation: we reach comparable performance on Cifar10 and provide baseline performances for Backpropagation-free methods on Cifar100, Tiny ImageNet and a small subset of ImageNet.
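To make the abstract's core idea concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of a "convolutional" layer whose filters are SOM prototypes trained with competitive learning on image patches instead of Backpropagation. The Hebbian-like mask learning between layers described in the abstract is omitted, and all function names and hyperparameters (e.g. `train_som_on_patches`, `som_conv_layer`, the grid size and learning-rate schedule) are hypothetical choices for the sketch, not taken from the paper.

```python
# Minimal sketch: SOM competitive learning replacing Backpropagation-trained
# convolution filters. Hypothetical names/hyperparameters, not the paper's code.
import numpy as np

def extract_patches(image, patch_size, stride):
    """Slide a window over a (H, W, C) image and return flattened patches."""
    H, W, C = image.shape
    patches = []
    for y in range(0, H - patch_size + 1, stride):
        for x in range(0, W - patch_size + 1, stride):
            patches.append(image[y:y + patch_size, x:x + patch_size, :].ravel())
    return np.array(patches)  # (num_patches, patch_size * patch_size * C)

def train_som_on_patches(patches, num_units, epochs=1, lr0=0.5, sigma0=None, seed=0):
    """Competitive (SOM) learning: each patch pulls its best-matching unit
    and that unit's grid neighbours toward the patch."""
    rng = np.random.default_rng(seed)
    dim = patches.shape[1]
    grid = int(np.sqrt(num_units))  # assume a square SOM grid
    coords = np.array([(i, j) for i in range(grid) for j in range(grid)], float)
    weights = rng.normal(scale=0.1, size=(grid * grid, dim))
    sigma0 = sigma0 or grid / 2.0
    steps = epochs * len(patches)
    t = 0
    for _ in range(epochs):
        for p in rng.permutation(len(patches)):
            x = patches[p]
            # best-matching unit = prototype closest to the patch
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            lr = lr0 * np.exp(-t / steps)
            sigma = sigma0 * np.exp(-t / steps)
            # Gaussian neighbourhood on the SOM grid
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            h = np.exp(-d2 / (2.0 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
            t += 1
    return weights

def som_conv_layer(image, weights, patch_size, stride):
    """Forward pass: each SOM unit's 'activation' at a location is its negative
    distance to the local patch, giving a (H', W', num_units) feature map."""
    H, W, C = image.shape
    out_h = (H - patch_size) // stride + 1
    out_w = (W - patch_size) // stride + 1
    patches = extract_patches(image, patch_size, stride)
    dists = np.linalg.norm(patches[:, None, :] - weights[None, :, :], axis=2)
    return (-dists).reshape(out_h, out_w, weights.shape[0])

if __name__ == "__main__":
    img = np.random.rand(32, 32, 3)  # stand-in for a Cifar10-sized image
    patches = extract_patches(img, patch_size=5, stride=1)
    som = train_som_on_patches(patches, num_units=16)
    fmap = som_conv_layer(img, som, patch_size=5, stride=1)
    print(fmap.shape)  # (28, 28, 16)
```

In this sketch, stacking such layers (with pooling in between) and training each one's prototypes only on patches of the previous layer's output mirrors the unsupervised, Backpropagation-free layer-wise learning the abstract describes; the paper additionally learns local masks between layers with Hebbian-like rules, which is not shown here.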