Paper Title
Exemplar-Based Contrastive Self-Supervised Learning with Few-Shot Class Incremental Learning
Paper Authors
Paper Abstract
Humans are capable of learning new concepts from only a few (labeled) exemplars, incrementally and continually. This happens within the context that we can differentiate among the exemplars, and between the exemplars and large amounts of other data (unlabeled and labeled). This suggests, in human learning, supervised learning of concepts based on exemplars takes place within the larger context of contrastive self-supervised learning (CSSL) based on unlabeled and labeled data. We discuss extending CSSL (1) to be based mainly on exemplars and only secondly on data augmentation, and (2) to apply to both unlabeled data (a large amount is available in general) and labeled data (a few exemplars can be obtained with valuable supervised knowledge). A major benefit of the extensions is that exemplar-based CSSL, with supervised finetuning, supports few-shot class incremental learning (CIL). Specifically, we discuss exemplar-based CSSL including: nearest-neighbor CSSL, neighborhood CSSL with supervised pretraining, and exemplar CSSL with supervised finetuning. We further discuss using exemplar-based CSSL to facilitate few-shot learning and, in particular, few-shot CIL.
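The abstract's core idea — treating labeled exemplars of the same class as additional positive pairs in a contrastive objective, beyond the usual augmentation-based positives — can be illustrated with a minimal sketch. The function below is a hypothetical, simplified supervised-contrastive loss written for illustration only; it is not the paper's actual method, and the name `exemplar_contrastive_loss`, the temperature value, and the positive-averaging scheme are all assumptions.

```python
import numpy as np

def exemplar_contrastive_loss(z, labels, temperature=0.1):
    """Illustrative exemplar-based contrastive loss (hypothetical sketch).

    For each anchor embedding, other embeddings sharing the same label act
    as positives (the "exemplar" pairs); all remaining embeddings act as
    negatives, in the style of a supervised NT-Xent objective.

    z: (N, D) embeddings; labels: (N,) integer class ids.
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalize rows
    sim = z @ z.T / temperature                        # scaled cosine similarity
    n = z.shape[0]
    self_mask = np.eye(n, dtype=bool)
    # Exclude each anchor's similarity to itself from the softmax.
    sim = np.where(self_mask, -np.inf, sim)
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # Positives: same label, different sample.
    pos = (labels[:, None] == labels[None, :]) & ~self_mask
    pos_log_prob = np.where(pos, log_prob, 0.0).sum(axis=1)
    # Average log-probability of positives per anchor, then negate.
    per_anchor = -pos_log_prob / np.maximum(pos.sum(axis=1), 1)
    return per_anchor.mean()
```

Under this sketch, embeddings that cluster by class yield a lower loss than embeddings whose same-label exemplars point in different directions, which is the property the abstract relies on when it argues that exemplar-based CSSL supports later supervised finetuning.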