Paper Title

Lipschitz constant estimation for 1D convolutional neural networks

Authors

Patricia Pauli, Dennis Gramlich, Frank Allgöwer

Abstract

In this work, we propose a dissipativity-based method for Lipschitz constant estimation of 1D convolutional neural networks (CNNs). In particular, we analyze the dissipativity properties of convolutional, pooling, and fully connected layers making use of incremental quadratic constraints for nonlinear activation functions and pooling operations. The Lipschitz constant of the concatenation of these mappings is then estimated by solving a semidefinite program which we derive from dissipativity theory. To make our method as efficient as possible, we exploit the structure of convolutional layers by realizing these finite impulse response filters as causal dynamical systems in state space and carrying out the dissipativity analysis for the state space realizations. The examples we provide show that our Lipschitz bounds are advantageous in terms of accuracy and scalability.
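To illustrate the state-space viewpoint described in the abstract, the sketch below realizes a 1D convolution (a finite impulse response filter) as a causal dynamical system x_{t+1} = A x_t + B u_t, y_t = C x_t + D u_t, and checks it against direct convolution. This is a minimal illustration of the standard FIR realization, not the authors' implementation; the function names and the dissipativity/SDP step are omitted.

```python
import numpy as np

def fir_state_space(h):
    """Realize the FIR filter y_t = sum_i h[i] * u_{t-i} in state space.

    The state stores the last k-1 inputs; A is a shift matrix,
    B injects the newest input, C holds the delayed taps, D the direct tap.
    """
    k = len(h)
    n = k - 1  # state dimension
    A = np.diag(np.ones(n - 1), -1) if n > 1 else np.zeros((n, n))
    B = np.zeros((n, 1))
    B[0, 0] = 1.0
    C = np.asarray(h[1:], dtype=float).reshape(1, n)
    D = np.array([[float(h[0])]])
    return A, B, C, D

def simulate(A, B, C, D, u):
    """Run the causal state-space system from zero initial state."""
    x = np.zeros((A.shape[0], 1))
    y = []
    for ut in u:
        y.append(float(C @ x + D * ut))  # output before updating the state
        x = A @ x + B * ut               # shift the input history
    return np.array(y)

h = [0.5, -0.2, 0.3]                     # example filter taps (hypothetical)
u = np.array([1.0, 2.0, -1.0, 0.5])      # example input signal
A, B, C, D = fir_state_space(h)
y = simulate(A, B, C, D, u)
# Matches causal convolution with zero initial conditions:
assert np.allclose(y, np.convolve(u, h)[: len(u)])
```

Once a convolutional layer is written in this (A, B, C, D) form, dissipativity conditions can be imposed on the small state-space matrices rather than on the full (input-length-dependent) Toeplitz matrix of the convolution, which is what makes the approach scale.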
