Paper Title
Online Deep Clustering for Unsupervised Representation Learning
Paper Authors
Paper Abstract
Joint clustering and feature learning methods have shown remarkable performance in unsupervised representation learning. However, the training schedule alternates between feature clustering and network parameter updates, which leads to unstable learning of visual representations. To overcome this challenge, we propose Online Deep Clustering (ODC), which performs clustering and network updates simultaneously rather than alternatingly. Our key insight is that the cluster centroids should evolve steadily to keep the classifier stably updated. Specifically, we design and maintain two dynamic memory modules: a samples memory to store samples' labels and features, and a centroids memory for centroids evolution. We break down the abrupt global clustering into steady memory updates and batch-wise label re-assignment, and integrate the process into network update iterations. In this way, labels and the network evolve shoulder-to-shoulder rather than alternatingly. Extensive experiments demonstrate that ODC stabilizes the training process and effectively boosts performance. Code: https://github.com/open-mmlab/OpenSelfSup.
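For concreteness, one ODC training iteration can be sketched roughly as follows. This is a minimal PyTorch sketch based only on the abstract; the class ODCMemory, its momentum value, and the helper odc_step are illustrative placeholders and assumptions, not the OpenSelfSup API.

# Illustrative sketch of one ODC iteration (not the OpenSelfSup implementation).
# Assumes a PyTorch feature extractor and a linear classifier on the same device;
# memory layout, momentum value, and centroid update are simplified placeholders.
import torch
import torch.nn.functional as F

class ODCMemory:
    """Samples memory (features + pseudo-labels) and centroids memory."""
    def __init__(self, num_samples, feat_dim, num_classes, momentum=0.5):
        self.features = F.normalize(torch.randn(num_samples, feat_dim), dim=1)
        self.labels = torch.randint(0, num_classes, (num_samples,))
        self.centroids = F.normalize(torch.randn(num_classes, feat_dim), dim=1)
        self.momentum = momentum

    def update_samples(self, idx, feat):
        # Steady memory update: blend new batch features into the stored ones.
        feat = F.normalize(feat.detach(), dim=1)
        old = self.features[idx]
        self.features[idx] = F.normalize(
            self.momentum * old + (1 - self.momentum) * feat, dim=1)

    def reassign_labels(self, idx):
        # Batch-wise label re-assignment: nearest centroid by cosine similarity.
        sim = self.features[idx] @ self.centroids.t()
        self.labels[idx] = sim.argmax(dim=1)

    def update_centroids(self):
        # Let centroids evolve from the features currently assigned to them
        # (simplified: a full recomputation over all classes).
        for c in range(self.centroids.size(0)):
            mask = self.labels == c
            if mask.any():
                self.centroids[c] = F.normalize(self.features[mask].mean(0), dim=0)

def odc_step(backbone, classifier, optimizer, memory, images, idx):
    feat = backbone(images)                              # batch features
    logits = classifier(feat)
    loss = F.cross_entropy(logits, memory.labels[idx])   # pseudo-labels as targets
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        memory.update_samples(idx, feat)                  # steady memory update
        memory.reassign_labels(idx)                       # batch-wise re-assignment
        memory.update_centroids()                         # centroids evolve gradually
    return loss.item()

The sketch mirrors the shoulder-to-shoulder update described in the abstract: pseudo-labels, stored features, and centroids all change a little at every network iteration, rather than in a separate global clustering pass.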