Paper Title
Inter-Region Affinity Distillation for Road Marking Segmentation
Paper Authors
Abstract
We study the problem of distilling knowledge from a large deep teacher network to a much smaller student network for the task of road marking segmentation. In this work, we explore a novel knowledge distillation (KD) approach that can transfer 'knowledge' on scene structure more effectively from a teacher to a student model. Our method is known as Inter-Region Affinity KD (IntRA-KD). It decomposes a given road scene image into different regions and represents each region as a node in a graph. An inter-region affinity graph is then formed by establishing pairwise relationships between nodes based on their similarity in feature distribution. To learn structural knowledge from the teacher network, the student is required to match the graph generated by the teacher. The proposed method shows promising results on three large-scale road marking segmentation benchmarks, i.e., ApolloScape, CULane and LLAMAS, by taking various lightweight models as students and ResNet-101 as the teacher. IntRA-KD consistently brings higher performance gains on all lightweight models, compared to previous distillation methods. Our code is available at https://github.com/cardwing/Codes-for-IntRA-KD.
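The affinity-graph idea described in the abstract can be sketched in a few lines. Note this is a minimal illustrative version, not the paper's implementation: it assumes mean pooling to get one descriptor per region, cosine similarity for the pairwise affinities, and a mean-squared error between the teacher's and student's affinity graphs; the region partition is taken as given (e.g., from ground-truth lane labels), and the helper names are our own.

```python
import numpy as np

def region_affinity(feat, region_mask, num_regions):
    """Build an inter-region affinity matrix from a feature map.

    feat        : (C, H, W) feature map from one network.
    region_mask : (H, W) integer labels in [0, num_regions), assigning
                  each pixel to a scene region (one graph node per region).
    Returns an (num_regions, num_regions) cosine-similarity matrix.
    """
    C = feat.shape[0]
    nodes = np.zeros((num_regions, C))
    for r in range(num_regions):
        m = region_mask == r
        if m.any():
            # Mean-pool features over the region (an assumption here;
            # the paper may use a richer region statistic).
            nodes[r] = feat[:, m].mean(axis=1)
    # Normalize descriptors so pairwise dot products are cosine similarities.
    norms = np.linalg.norm(nodes, axis=1, keepdims=True) + 1e-8
    unit = nodes / norms
    return unit @ unit.T

def affinity_distill_loss(feat_student, feat_teacher, region_mask, num_regions):
    """MSE between student and teacher inter-region affinity graphs.

    Minimizing this pushes the student to match the structural
    relationships the teacher encodes between scene regions.
    """
    a_s = region_affinity(feat_student, region_mask, num_regions)
    a_t = region_affinity(feat_teacher, region_mask, num_regions)
    return float(np.mean((a_s - a_t) ** 2))
```

Because the loss compares only pairwise region relationships rather than raw feature values, the student is free to have a much smaller feature dimensionality or magnitude than the teacher, which is what makes this kind of structural distillation attractive for lightweight models.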