Paper Title

On the Versatile Uses of Partial Distance Correlation in Deep Learning

Authors

Xingjian Zhen, Zihang Meng, Rudrasis Chakraborty, Vikas Singh

Abstract

Comparing the functional behavior of neural network models, whether it is a single network over time or two (or more) networks during or post-training, is an essential step in understanding what they are learning (and what they are not), and for identifying strategies for regularization or efficiency improvements. Despite recent progress, e.g., comparing vision transformers to CNNs, systematic comparison of function, especially across different networks, remains difficult and is often carried out layer by layer. Approaches such as canonical correlation analysis (CCA) are applicable in principle, but have been sparingly used so far. In this paper, we revisit a (less widely known) measure from statistics, called distance correlation (and its partial variant), designed to evaluate correlation between feature spaces of different dimensions. We describe the steps necessary to carry out its deployment for large scale models -- this opens the door to a surprising array of applications ranging from conditioning one deep model w.r.t. another, learning disentangled representations, as well as optimizing diverse models that would directly be more robust to adversarial attacks. Our experiments suggest a versatile regularizer (or constraint) with many advantages, which avoids some of the common difficulties one faces in such analyses. Code is at https://github.com/zhenxingjian/Partial_Distance_Correlation.
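For readers unfamiliar with the quantity the abstract refers to, the (biased) sample distance correlation of Székely et al. can be sketched in a few lines of NumPy: build the pairwise Euclidean distance matrix of each feature set, double-center both matrices, and compare them. This is a minimal illustration, not the paper's implementation; the function name and shapes are assumptions for the sketch, and it works for feature matrices of different dimensions, which is the property the abstract highlights.

```python
import numpy as np

def distance_correlation(X, Y):
    """Biased sample distance correlation between X (n, p) and Y (n, q).

    p and q may differ; only the number of samples n must match.
    Returns a value in [0, 1]; 0 indicates (empirical) independence.
    """
    def doubly_centered_distances(Z):
        # Pairwise Euclidean distance matrix, then double-centering:
        # subtract row means and column means, add back the grand mean.
        d = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
        return d - d.mean(axis=0, keepdims=True) - d.mean(axis=1, keepdims=True) + d.mean()

    A = doubly_centered_distances(X)
    B = doubly_centered_distances(Y)
    dcov2 = (A * B).mean()                       # squared distance covariance
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    if denom == 0.0:
        return 0.0
    return np.sqrt(max(dcov2, 0.0) / denom)
```

For example, rescaling the features (`Y = 3 * X`) rescales all pairwise distances by the same factor, so the distance correlation is exactly 1; the partial variant discussed in the paper additionally removes the component explained by a third feature set.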
