Paper Title
On the Subspace Structure of Gradient-Based Meta-Learning
Paper Authors
Paper Abstract
In this work, we provide an analysis of the distribution of the post-adaptation parameters of Gradient-Based Meta-Learning (GBML) methods. Previous work has observed that, in the case of image classification, this adaptation takes place only in the last layers of the network. We propose the more general notion that parameters are updated over a low-dimensional \emph{subspace} of the same dimensionality as the task space, and show that this holds for regression as well. Furthermore, the induced subspace structure provides a method to estimate the intrinsic dimension of the space of tasks of common few-shot learning datasets.
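The abstract's claim suggests a simple diagnostic: collect the post-adaptation parameter updates across many tasks and check how many principal directions they actually span. The sketch below is hypothetical and not the paper's code; it assumes the updates lie in a low-dimensional subspace (here of dimension 3) plus small noise, and estimates the intrinsic dimension via PCA (SVD of the centered update matrix).

```python
import numpy as np

# Hypothetical illustration, not the paper's implementation: simulate
# post-adaptation parameter updates that lie in a low-dimensional subspace
# of parameter space, then recover that subspace's dimension with PCA.
rng = np.random.default_rng(0)

n_tasks, n_params, true_dim = 200, 1000, 3  # assumed sizes for the sketch

# Random orthonormal basis spanning a true_dim-dimensional subspace.
basis, _ = np.linalg.qr(rng.normal(size=(n_params, true_dim)))

# Per-task updates: a component inside the subspace plus small isotropic noise.
coords = rng.normal(size=(n_tasks, true_dim))
deltas = coords @ basis.T + 0.01 * rng.normal(size=(n_tasks, n_params))

# PCA via SVD of the centered update matrix; count directions that carry
# a non-negligible fraction (here >1%) of the total variance.
centered = deltas - deltas.mean(axis=0)
sing = np.linalg.svd(centered, compute_uv=False)
var_ratio = sing**2 / np.sum(sing**2)
est_dim = int(np.sum(var_ratio > 0.01))

print(est_dim)  # → 3
```

The 1% variance threshold is an arbitrary choice for this toy setup; on real adapted networks one would instead look for an elbow in the singular-value spectrum of the update matrix.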