Paper Title

GDC: Generalized Distribution Calibration for Few-Shot Learning

Authors

Shakti Kumar, Hussain Zaidi

Abstract

Few-shot learning is an important problem in machine learning, as large labelled datasets take considerable time and effort to assemble. Most few-shot learning algorithms suffer from one of two limitations: they either require the design of sophisticated models and loss functions, hampering interpretability; or they employ statistical techniques that rest on assumptions which may not hold across different datasets or features. Building on recent work that extrapolates the distributions of small-sample classes from the most similar larger classes, we propose a generalized sampling method that learns to estimate few-shot distributions for classification as weighted random variables over all large classes. We use a form of covariance shrinkage to provide robustness against singular covariances arising from overparameterized features or small datasets. We show that our sampled points remain close to the few-shot classes even when no similar large class exists in the training set. Our method works with arbitrary off-the-shelf feature extractors and outperforms the existing state-of-the-art on the miniImagenet, CUB and Stanford Dogs datasets by 3% to 5% on 5-way 1-shot and 5-way 5-shot tasks, and by 1% on challenging cross-domain tasks.
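The covariance shrinkage the abstract refers to can be illustrated with a minimal sketch: blend the (possibly singular) empirical covariance with a scaled-identity target so the result is always invertible, even when a class has fewer samples than feature dimensions. The function name `shrink_covariance` and the shrinkage weight `alpha` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def shrink_covariance(features, alpha=0.2):
    """Shrink the empirical covariance toward a scaled identity.

    Hypothetical helper sketching the kind of covariance shrinkage the
    paper describes; `alpha` (assumed) controls the shrinkage strength.
    """
    cov = np.cov(features, rowvar=False)         # d x d empirical covariance
    d = cov.shape[0]
    target = (np.trace(cov) / d) * np.eye(d)     # scaled-identity target
    # Convex blend: strictly positive definite for any alpha > 0,
    # even when n_samples < d makes the empirical covariance singular.
    return (1.0 - alpha) * cov + alpha * target

# A few-shot class with fewer samples than feature dimensions
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 64))                     # 5 shots, 64-dim features
cov = shrink_covariance(x, alpha=0.2)
print(np.linalg.matrix_rank(cov))                # full rank despite n < d
```

Without shrinkage, the 64×64 covariance of 5 samples has rank at most 4 and cannot be used to parameterize a Gaussian for sampling; the blended estimate is full rank, which is what allows sampling additional points for the few-shot class.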
