Paper Title

Memory-Efficient Incremental Learning Through Feature Adaptation

Paper Authors

Ahmet Iscen, Jeffrey Zhang, Svetlana Lazebnik, Cordelia Schmid

Paper Abstract

We introduce an approach for incremental learning that preserves feature descriptors of training images from previously learned classes, instead of the images themselves, unlike most existing work. Keeping the much lower-dimensional feature embeddings of images reduces the memory footprint significantly. We assume that the model is updated incrementally for new classes as new data becomes available sequentially. This requires adapting the previously stored feature vectors to the updated feature space without having access to the corresponding original training images. Feature adaptation is learned with a multi-layer perceptron, which is trained on feature pairs corresponding to the outputs of the original and updated network on a training image. We validate experimentally that such a transformation generalizes well to the features of the previous set of classes, and maps features to a discriminative subspace in the feature space. As a result, the classifier is optimized jointly over new and old classes without requiring old class images. Experimental results show that our method achieves state-of-the-art classification accuracy on incremental learning benchmarks, while having at least an order of magnitude lower memory footprint compared to image-preserving strategies.
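The abstract outlines the core mechanism: a multi-layer perceptron is trained on feature pairs (old-network output, updated-network output) computed on currently available training images, and is then used to map stored descriptors of old classes into the new feature space. Below is a minimal sketch of that idea in PyTorch; the names `FeatureAdapter` and `train_adapter`, the two-layer MLP architecture, the MSE regression loss, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Hypothetical adaptation network: a small MLP mapping features from the
# old model's embedding space to the updated model's space. Layer sizes
# are illustrative, not taken from the paper.
class FeatureAdapter(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, x):
        return self.mlp(x)

def train_adapter(old_model, new_model, image_batches, dim=512,
                  epochs=10, lr=1e-3):
    """Fit the adapter on feature pairs (old_model(x), new_model(x))
    extracted from the currently available training images."""
    adapter = FeatureAdapter(dim)
    opt = torch.optim.Adam(adapter.parameters(), lr=lr)
    loss_fn = nn.MSELoss()  # placeholder regression loss
    old_model.eval()
    new_model.eval()
    for _ in range(epochs):
        for x in image_batches:
            with torch.no_grad():
                f_old = old_model(x)  # feature in the previous space
                f_new = new_model(x)  # target in the updated space
            loss = loss_fn(adapter(f_old), f_new)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return adapter

# Stored low-dimensional descriptors of old classes can then be mapped
# into the updated space without revisiting the original images:
# adapted_memory = adapter(stored_old_features)
```

Once trained, the adapter transforms the stored descriptors so the classifier can be optimized jointly over old and new classes while no old-class images are retained, which is what yields the reduced memory footprint the abstract describes.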
