Paper Title

Improving Lipschitz-Constrained Neural Networks by Learning Activation Functions

Authors

Stanislas Ducotterd, Alexis Goujon, Pakshal Bohra, Dimitris Perdios, Sebastian Neumayer, Michael Unser

Abstract

Lipschitz-constrained neural networks have several advantages over unconstrained ones and can be applied to a variety of problems, making them a topic of attention in the deep learning community. Unfortunately, it has been shown both theoretically and empirically that they perform poorly when equipped with ReLU activation functions. By contrast, neural networks with learnable 1-Lipschitz linear splines are known to be more expressive. In this paper, we show that such networks correspond to global optima of a constrained functional optimization problem that consists of the training of a neural network composed of 1-Lipschitz linear layers and 1-Lipschitz freeform activation functions with second-order total-variation regularization. Further, we propose an efficient method to train these neural networks. Our numerical experiments show that our trained networks compare favorably with existing 1-Lipschitz neural architectures.
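The abstract combines three ingredients: 1-Lipschitz linear layers, learnable 1-Lipschitz linear-spline activations, and a second-order total-variation (TV(2)) penalty on the splines. The sketch below is not the authors' implementation; it only illustrates one way these pieces could fit together in PyTorch. Spectral normalization stands in for the 1-Lipschitz layer constraint, each `LinearSpline` learns its values on a uniform knot grid with slopes clipped to [-1, 1], and `tv2()` returns the l1 norm of second finite differences. The knot count, grid range, clipping scheme, and penalty weight are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the authors' implementation) of a
# 1-Lipschitz network with learnable linear-spline activations and a
# second-order total-variation (TV(2)) penalty, as described in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LinearSpline(nn.Module):
    """Learnable linear spline on a uniform knot grid, clipped to be 1-Lipschitz.

    For simplicity one spline is shared across all neurons of a layer; the
    paper learns activation functions per neuron.
    """

    def __init__(self, num_knots: int = 21, grid_range: float = 3.0):
        super().__init__()
        self.register_buffer("knots", torch.linspace(-grid_range, grid_range, num_knots))
        self.step = 2.0 * grid_range / (num_knots - 1)
        # Initialize as the identity, which is already 1-Lipschitz.
        self.values = nn.Parameter(torch.linspace(-grid_range, grid_range, num_knots))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Slope of each linear piece, clipped to [-1, 1] so the spline
        # (and hence the composed network) stays 1-Lipschitz.
        slopes = (torch.diff(self.values) / self.step).clamp(-1.0, 1.0)
        # Rebuild knot values from the clipped slopes to keep continuity.
        values = torch.cat(
            [self.values[:1], self.values[0] + torch.cumsum(slopes * self.step, dim=0)]
        )
        # Locate each input in the grid; edge pieces extrapolate linearly.
        idx = ((x - self.knots[0]) / self.step).floor().long().clamp(0, len(slopes) - 1)
        x0 = self.knots[0] + idx * self.step
        return values[idx] + slopes[idx] * (x - x0)

    def tv2(self) -> torch.Tensor:
        # TV(2) regularizer: l1 norm of second finite differences, which
        # penalizes slope changes and promotes splines with few active knots.
        return torch.diff(self.values, n=2).abs().sum()


def lipschitz_linear(d_in: int, d_out: int) -> nn.Module:
    # Spectral normalization bounds the layer's Lipschitz constant by 1;
    # it stands in here for whichever constraint the paper actually uses.
    return nn.utils.parametrizations.spectral_norm(nn.Linear(d_in, d_out))


# Toy usage: regression loss plus the TV(2) penalty summed over all splines.
model = nn.Sequential(
    lipschitz_linear(2, 64), LinearSpline(),
    lipschitz_linear(64, 64), LinearSpline(),
    lipschitz_linear(64, 1),
)
x, y = torch.randn(128, 2), torch.randn(128, 1)
lam = 1e-4  # illustrative regularization weight
tv2 = sum(m.tv2() for m in model.modules() if isinstance(m, LinearSpline))
loss = F.mse_loss(model(x), y) + lam * tv2
loss.backward()
```

The TV(2) term is the key design choice suggested by the abstract: penalizing second finite differences of the spline values is the discrete analogue of second-order total variation, so the learned activations are encouraged to be simple piecewise-linear functions rather than arbitrarily wiggly ones.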
