Paper Title
MetaDT: Meta Decision Tree with Class Hierarchy for Interpretable Few-Shot Learning
Paper Authors
Paper Abstract
Few-Shot Learning (FSL) is a challenging task that aims to recognize novel classes from only a few examples. Recently, many methods have been proposed from the perspectives of meta-learning and representation learning. However, few works focus on the interpretability of the FSL decision process. In this paper, we take a step towards interpretable FSL by proposing a novel meta-learning-based decision tree framework, namely MetaDT. In particular, FSL interpretability is achieved from two aspects, i.e., a concept aspect and a visual aspect. On the concept aspect, we first introduce a tree-like concept hierarchy as an FSL prior. Then, resorting to this prior, we split each few-shot task into a set of subtasks at different concept levels and perform class prediction via a decision tree model. The advantage of this design is that a sequence of high-level concept decisions leading up to the final class prediction can be obtained, which clarifies the FSL decision process. On the visual aspect, a set of subtask-specific classifiers with a visual attention mechanism is designed to perform the decision at each node of the decision tree. As a result, a subtask-specific heatmap visualization can be obtained, achieving decision interpretability at each tree node. Finally, to alleviate the data scarcity issue of FSL, we regard the concept hierarchy prior as an undirected graph and design a graph convolution-based decision tree inference network as our meta-learner to infer the parameters of the decision tree. Extensive experiments on performance comparison and interpretability analysis show the superiority of our MetaDT.
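To make the two core ideas of the abstract concrete, the following is a minimal PyTorch-style sketch: a graph convolution over the (undirected) concept-hierarchy graph that infers a classifier weight vector for every tree node, and a hierarchical prediction routine that chains the per-node decisions along each root-to-leaf path. This is not the authors' implementation; the layer sizes, module names (`GraphConv`, `TreeParamInference`, `hierarchical_predict`), and the choice of node input features are all assumptions made for illustration.

```python
# Illustrative sketch only (NOT the MetaDT code): (1) a graph-convolution
# meta-learner that infers classifier weights for every node of a class
# hierarchy, and (2) hierarchical class prediction along root-to-leaf paths.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphConv(nn.Module):
    """One plain graph-convolution layer: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h, a_hat):
        # a_hat: normalized adjacency of the undirected concept-hierarchy
        # graph, shape (num_nodes, num_nodes); h: node features.
        return F.relu(self.linear(a_hat @ h))


class TreeParamInference(nn.Module):
    """Meta-learner sketch: maps node features (an assumption here, e.g.
    concept embeddings combined with support-set statistics) to one
    classifier weight vector per tree node."""
    def __init__(self, node_dim, feat_dim, hidden_dim=256):
        super().__init__()
        self.gc1 = GraphConv(node_dim, hidden_dim)
        self.gc2 = GraphConv(hidden_dim, feat_dim)

    def forward(self, node_feats, a_hat):
        return self.gc2(self.gc1(node_feats, a_hat), a_hat)


def hierarchical_predict(img_feat, node_weights, children, root):
    """Chain decisions down the tree: a leaf (novel class) score is the
    product of the softmaxed branch decisions taken at every ancestor."""
    leaf_scores = {}

    def descend(node, prob):
        kids = children.get(node, [])
        if not kids:                      # leaf node = a novel class
            leaf_scores[node] = prob
            return
        logits = torch.stack([img_feat @ node_weights[k] for k in kids])
        branch_probs = F.softmax(logits, dim=0)
        for k, p in zip(kids, branch_probs):
            descend(k, prob * p)

    descend(root, torch.tensor(1.0))
    return leaf_scores
```

In this sketch, `children` is a dict mapping each concept node to its child nodes, and `node_weights` holds the per-node weight vectors produced by `TreeParamInference`; the visual-attention classifiers and heatmap visualization described in the abstract are omitted for brevity.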