Paper Title

Hyperspherical Consistency Regularization

Authors

Cheng Tan, Zhangyang Gao, Lirong Wu, Siyuan Li, Stan Z. Li

Abstract

Recent advances in contrastive learning have inspired diverse applications across semi-supervised fields. Jointly training supervised learning and unsupervised learning with a shared feature encoder has become a common scheme. Though it benefits from taking advantage of both feature-dependent information from self-supervised learning and label-dependent information from supervised learning, this scheme still suffers from bias of the classifier. In this work, we systematically explore the relationship between self-supervised learning and supervised learning, and study how self-supervised learning helps robust data-efficient deep learning. We propose hyperspherical consistency regularization (HCR), a simple yet effective plug-and-play method, to regularize the classifier using feature-dependent information and thus avoid bias from labels. Specifically, HCR first projects logits from the classifier and feature projections from the projection head onto their respective hyperspheres, then it enforces data points on the hyperspheres to have similar structures by minimizing the binary cross entropy of pairwise distances' similarity metrics. Extensive experiments on semi-supervised and weakly-supervised learning demonstrate the effectiveness of our method, showing superior performance with HCR.
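The core step described in the abstract (project logits and features onto unit hyperspheres, then match their pairwise-similarity structures via binary cross entropy) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian-kernel mapping from pairwise distances to similarities and the function name `hcr_loss` are assumptions made for this sketch.

```python
import numpy as np

def hcr_loss(logits, features, eps=1e-8):
    """Hedged sketch of a hyperspherical consistency loss between
    classifier logits and projection-head features."""
    # Project both sets of points onto unit hyperspheres
    # (row-wise L2 normalization).
    z = logits / (np.linalg.norm(logits, axis=1, keepdims=True) + eps)
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + eps)

    def pairwise_sim(x):
        # Pairwise squared Euclidean distances on the hypersphere,
        # mapped to (0, 1] similarities with a Gaussian kernel
        # (the exact similarity metric is an assumption here).
        d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
        return np.exp(-d2)

    p = pairwise_sim(z)  # similarity structure induced by the classifier
    q = pairwise_sim(f)  # similarity structure induced by the projection head
    # Binary cross entropy between the two pairwise-similarity matrices,
    # treating the feature-side similarities q as the target.
    bce = -(q * np.log(p + eps) + (1.0 - q) * np.log(1.0 - p + eps))
    # Exclude the diagonal: each point is trivially similar to itself.
    mask = ~np.eye(len(p), dtype=bool)
    return bce[mask].mean()
```

In a joint training scheme, this scalar would be added as a regularization term to the supervised and self-supervised losses, pushing the classifier's hyperspherical geometry to agree with the feature encoder's.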
