Paper Title

Learning unfolded networks with a cyclic group structure

Paper Authors

Emmanouil Theodosis, Demba Ba

Abstract

Deep neural networks lack straightforward ways to incorporate domain knowledge and are notoriously considered black boxes. Prior works attempted to inject domain knowledge into architectures implicitly through data augmentation. Building on recent advances in equivariant neural networks, we propose networks that explicitly encode domain knowledge, specifically equivariance with respect to rotations. By using unfolded architectures, a rich framework that originated from sparse coding and has theoretical guarantees, we present interpretable networks with sparse activations. The equivariant unfolded networks compete favorably with baselines, with only a fraction of their parameters, as showcased on (rotated) MNIST and CIFAR-10.
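To make the "unfolded architecture" idea concrete, below is a minimal NumPy sketch of an unrolled ISTA network, the classic sparse-coding construction this line of work builds on. It is not the authors' equivariant model; the function names, problem sizes, and hyperparameters (`unfolded_ista`, `lam`, `n_layers`) are illustrative assumptions. Each "layer" is one ISTA iteration, and the soft-thresholding step is the sparse activation the abstract refers to; in a learned (LISTA-style) network, the matrices and thresholds would be trainable parameters.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: the sparsifying activation
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def unfolded_ista(y, W, lam=0.05, n_layers=200):
    """One forward pass through an unfolded ISTA network.

    Each layer computes a gradient step on ||Wx - y||^2 followed by
    soft thresholding, so intermediate activations are sparse codes.
    """
    L = np.linalg.norm(W, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(W.shape[1])
    for _ in range(n_layers):
        x = soft_threshold(x - (W.T @ (W @ x - y)) / L, lam / L)
    return x

# Illustrative synthetic problem: recover a 3-sparse code from
# 30 linear measurements through a random dictionary W.
rng = np.random.default_rng(0)
W = rng.standard_normal((30, 60)) / np.sqrt(30)
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [1.5, -2.0, 1.0]
y = W @ x_true
x_hat = unfolded_ista(y, W)
```

The forward pass returns a sparse estimate `x_hat`; the equivariant networks in the paper replace the dense dictionary with rotation-structured (cyclic-group) filters, which is where the parameter savings come from.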
