Paper Title

Markov-Lipschitz Deep Learning

Authors

Stan Z. Li, Zelin Zang, Lirong Wu

Abstract

We propose a novel framework, called Markov-Lipschitz deep learning (MLDL), to tackle geometric deterioration caused by collapse, twisting, or crossing in vector-based neural network transformations for manifold-based representation learning and manifold data generation. A prior constraint, called locally isometric smoothness (LIS), is imposed across-layers and encoded into a Markov random field (MRF)-Gibbs distribution. This leads to the best possible solutions for local geometry preservation and robustness as measured by locally geometric distortion and locally bi-Lipschitz continuity. Consequently, the layer-wise vector transformations are enhanced into well-behaved, LIS-constrained metric homeomorphisms. Extensive experiments, comparisons, and ablation study demonstrate significant advantages of MLDL for manifold learning and manifold data generation. MLDL is general enough to enhance any vector transformation-based networks. The code is available at https://github.com/westlake-cairi/Markov-Lipschitz-Deep-Learning.
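The abstract describes a locally isometric smoothness (LIS) prior: layer-wise transformations should preserve pairwise distances within local neighborhoods, keeping the map locally bi-Lipschitz. A minimal sketch of that idea, not the paper's actual loss, is to penalize the change in distances to each point's k nearest neighbors between a layer's input and output; the function names, the k-NN neighborhood choice, and plain Euclidean distances here are illustrative assumptions.

```python
import numpy as np

def pairwise_dists(X):
    # Euclidean distance matrix for the rows of X
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def lis_loss(X_in, X_out, k=3):
    # Illustrative local-isometry penalty (hypothetical, not the paper's exact loss):
    # sum of |d_in - d_out| over each point's k nearest input-space neighbors.
    D_in, D_out = pairwise_dists(X_in), pairwise_dists(X_out)
    n = len(X_in)
    loss = 0.0
    for i in range(n):
        nbrs = np.argsort(D_in[i])[1:k + 1]  # skip the point itself (distance 0)
        loss += np.abs(D_in[i, nbrs] - D_out[i, nbrs]).sum()
    return loss / (n * k)
```

Under this sketch, an isometry such as a translation incurs zero penalty, while a scaling or a neighborhood collapse is penalized, which mirrors the "well-behaved, LIS-constrained metric homeomorphism" property the abstract claims for each layer.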
