Paper Title
Lipschitz constant estimation of Neural Networks via sparse polynomial optimization
Paper Authors
Paper Abstract
We introduce LiPopt, a polynomial optimization framework for computing increasingly tighter upper bounds on the Lipschitz constant of neural networks. The underlying optimization problems boil down to either linear (LP) or semidefinite (SDP) programming. We show how to use the sparse connectivity of a network to significantly reduce the complexity of computation. This is especially useful for convolutional as well as pruned neural networks. We conduct experiments on networks with random weights as well as networks trained on MNIST, showing that in the particular case of the $\ell_\infty$-Lipschitz constant, our approach yields superior estimates compared to baselines available in the literature.
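To make the setup in the abstract concrete: for a one-hidden-layer network $f(x) = w_2^\top \sigma(W_1 x)$ with activations whose derivatives lie in $[0,1]$ (e.g. ReLU), the $\ell_\infty$-Lipschitz constant satisfies $L_\infty(f) = \sup_x \|\nabla f(x)\|_1 \le \max_{s \in [0,1]^m} \|W_1^\top \mathrm{diag}(s)\, w_2\|_1$, a polynomial optimization problem over a box, to which LP/SDP relaxation hierarchies apply. The snippet below is an illustrative numerical sketch of that idea, not the paper's LiPopt implementation: it compares the naive product-of-norms bound, a coordinate-wise box relaxation (a crude stand-in for the first level of an LP hierarchy), and the exact box maximum by vertex enumeration. All names (`W1`, `w2`, `m`, `d`) are hypothetical.

```python
import numpy as np
from itertools import product

# Sketch (not the paper's LiPopt code): bounds on the l_inf-Lipschitz
# constant of a one-hidden-layer network f(x) = w2 @ relu(W1 @ x).
# Since ReLU derivatives lie in [0, 1], the chain rule gives
#     L_inf(f) <= max_{s in [0,1]^m} || W1.T @ diag(s) @ w2 ||_1,
# a polynomial optimization problem over the box [0, 1]^m.

rng = np.random.default_rng(0)
m, d = 12, 8                         # hidden width, input dimension
W1 = rng.standard_normal((m, d))
w2 = rng.standard_normal(m)

# Naive bound: product of layer-wise operator norms (l_inf -> l_inf for W1,
# then l_inf -> R for w2), which ignores interaction between the layers.
naive = np.abs(w2).sum() * np.abs(W1).sum(axis=1).max()

# Coordinate-wise box relaxation: maximize each coordinate of
# W1.T @ diag(s) @ w2 over its own independent copy of s in [0,1]^m.
C = w2[:, None] * W1                 # C[i, j] = w2[i] * W1[i, j]
box = np.maximum(np.clip(C, 0, None).sum(axis=0),
                 np.clip(-C, 0, None).sum(axis=0)).sum()

# Exact value of the box-constrained problem by vertex enumeration: the
# maximum of a convex function over a box is attained at a vertex.
# Only feasible for tiny m (2^m vertices).
exact = max(np.abs(W1.T @ (np.asarray(s) * w2)).sum()
            for s in product((0.0, 1.0), repeat=m))

print(f"naive product bound : {naive:.3f}")
print(f"box relaxation bound: {box:.3f}")
print(f"exact box maximum   : {exact:.3f}")   # naive >= box >= exact
```

The ordering `naive >= box >= exact` holds by construction: the box relaxation decouples the output coordinates (each gets its own $s$), and both bounds dominate the joint maximum over a single $s$. The paper's hierarchy-based bounds tighten this further by keeping the coupling and exploiting the network's sparse connectivity.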