Paper Title
Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks
Paper Authors
Paper Abstract
New classes arise frequently in our ever-changing world, e.g., emerging topics in social media and new types of products in e-commerce. A model should recognize new classes while maintaining discriminability over old classes. Under severe circumstances, only limited novel instances are available to incrementally update the model. The task of recognizing few-shot new classes without forgetting old classes is called few-shot class-incremental learning (FSCIL). In this work, we propose a new meta-learning-based paradigm for FSCIL, LearnIng Multi-phase Incremental Tasks (LIMIT), which synthesizes fake FSCIL tasks from the base dataset. The data format of the fake tasks is consistent with the "real" incremental tasks, so a feature space that generalizes to unseen tasks can be built through meta-learning. Besides, LIMIT also constructs a transformer-based calibration module, which calibrates the old-class classifiers and new-class prototypes to the same scale and fills the semantic gap. The calibration module also adaptively contextualizes instance-specific embeddings with a set-to-set function. LIMIT efficiently adapts to new classes while resisting forgetting of old classes. Experiments on three benchmark datasets (CIFAR100, miniImageNet, and CUB200) and a large-scale dataset, ImageNet ILSVRC2012, validate that LIMIT achieves state-of-the-art performance.
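A minimal sketch, in a PyTorch setting, of the two mechanisms the abstract describes: (1) sampling a fake multi-phase incremental task from the base dataset by splitting base classes into pseudo-old and pseudo-new parts, and (2) a transformer-based set-to-set calibration over the joint set of old-class classifiers and new-class prototypes. All names (`sample_fake_task`, `CalibrationModule`, the toy dimensions) are hypothetical illustrations, not the authors' implementation.

```python
# Sketch of LIMIT's two ideas, under assumed data structures:
# fake FSCIL task sampling + transformer-based set-to-set calibration.
import random
import torch
import torch.nn as nn


def sample_fake_task(features_by_class, n_old, n_new, k_shot):
    """Split base classes into pseudo-old and pseudo-new to mimic FSCIL.

    features_by_class: dict mapping class id -> tensor [n_i, d] of embeddings.
    Returns mean embeddings of pseudo-old classes (standing in for the old
    classifiers) and k-shot prototypes for the pseudo-new classes.
    """
    classes = random.sample(list(features_by_class), n_old + n_new)
    old_cls, new_cls = classes[:n_old], classes[n_old:]
    # Pseudo-old classes: all their embeddings are available.
    old_weights = torch.stack([features_by_class[c].mean(0) for c in old_cls])
    # Pseudo-new classes: only k_shot instances, as in a real incremental session.
    new_protos = torch.stack([
        features_by_class[c][torch.randperm(len(features_by_class[c]))[:k_shot]].mean(0)
        for c in new_cls
    ])
    return old_weights, new_protos


class CalibrationModule(nn.Module):
    """Set-to-set calibration: a transformer encoder jointly adapts the
    concatenated old classifiers and new prototypes so they share one scale."""

    def __init__(self, dim, n_heads=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)

    def forward(self, old_weights, new_protos):
        # Treat all classifiers/prototypes as one set; self-attention lets
        # each element be calibrated in the context of all the others.
        joint = torch.cat([old_weights, new_protos], dim=0).unsqueeze(0)
        calibrated = self.encoder(joint).squeeze(0)
        return calibrated[: len(old_weights)], calibrated[len(old_weights):]


# Toy usage with random embeddings standing in for a base dataset.
feats = {c: torch.randn(50, 64) for c in range(20)}
old_w, new_p = sample_fake_task(feats, n_old=10, n_new=5, k_shot=5)
old_w_cal, new_p_cal = CalibrationModule(dim=64)(old_w, new_p)
```

Meta-training would repeat `sample_fake_task` over many such episodes, so that the learned feature space and calibration module transfer to real incremental sessions containing classes never seen during training.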