Paper Title
Privileged Prior Information Distillation for Image Matting
Paper Authors
Paper Abstract
The performance of trimap-free image matting methods is limited when trying to decouple deterministic and undetermined regions, especially in scenes where foregrounds are semantically ambiguous, chromaless, or highly transmissive. In this paper, we propose a novel framework named Privileged Prior Information Distillation for Image Matting (PPID-IM) that can effectively transfer privileged prior environment-aware information to improve the performance of students in solving hard foregrounds. The trimap prior regulates only the teacher model during the training stage and is not fed into the student network during actual inference. To achieve effective privileged cross-modality (i.e., trimap and RGB) information distillation, we introduce a Cross-Level Semantic Distillation (CLSD) module that reinforces the trimap-free students with more knowledgeable semantic representations and environment-aware information. We also propose an Attention-Guided Local Distillation module that efficiently transfers privileged local attributes from the trimap-based teacher to the trimap-free students to guide local-region optimization. Extensive experiments demonstrate the effectiveness and superiority of our PPID framework on the image matting task. In addition, our trimap-free IndexNet-PPID surpasses other competing state-of-the-art methods by a large margin, especially in scenarios with chromaless, weakly textured, or irregular objects.
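To make the privileged-distillation setup concrete, below is a minimal, hypothetical PyTorch sketch of one training step in this style, not the authors' implementation: the trimap-based teacher receives RGB + trimap while the trimap-free student receives RGB only, and the student is trained with an alpha regression loss plus simplified stand-ins for the cross-level semantic and attention-guided local distillation terms. All names (TinyMattingNet, attention_map, ppid_step), the toy architecture, and the exact loss forms are illustrative assumptions.

```python
# Hypothetical sketch of one PPID-style training step (not the paper's code).
# Teacher: RGB + trimap (privileged input). Student: RGB only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMattingNet(nn.Module):
    """Toy encoder-decoder; in_ch = 4 for the trimap-based teacher, 3 for the student."""
    def __init__(self, in_ch):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())
        self.dec = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(64, 1, 3, padding=1))

    def forward(self, x):
        feat = self.enc(x)                      # intermediate feature used for distillation
        alpha = torch.sigmoid(self.dec(feat))   # predicted alpha matte in [0, 1]
        return alpha, feat

def attention_map(feat):
    # Spatial attention from channel-wise feature energy, normalized per image.
    att = feat.pow(2).mean(dim=1, keepdim=True)
    return att / (att.flatten(1).sum(dim=1).view(-1, 1, 1, 1) + 1e-6)

def ppid_step(teacher, student, rgb, trimap, gt_alpha, opt, w_sem=1.0, w_loc=1.0):
    """One training step: distill privileged teacher knowledge into the trimap-free student."""
    with torch.no_grad():                                   # teacher is frozen / pre-trained
        _, t_feat = teacher(torch.cat([rgb, trimap], dim=1))
    s_alpha, s_feat = student(rgb)

    loss_alpha = F.l1_loss(s_alpha, gt_alpha)               # supervised matting loss
    loss_sem = F.mse_loss(s_feat, t_feat)                   # semantic feature distillation (simplified)
    t_att = attention_map(t_feat)                           # teacher attention marks hard local regions
    loss_loc = F.l1_loss(s_feat * t_att, t_feat * t_att)    # attention-guided local distillation (simplified)

    loss = loss_alpha + w_sem * loss_sem + w_loc * loss_loc
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

if __name__ == "__main__":
    teacher, student = TinyMattingNet(in_ch=4), TinyMattingNet(in_ch=3)
    opt = torch.optim.Adam(student.parameters(), lr=1e-4)
    rgb = torch.rand(2, 3, 64, 64)
    trimap = torch.rand(2, 1, 64, 64)       # privileged input, used only at training time
    gt_alpha = torch.rand(2, 1, 64, 64)
    print(ppid_step(teacher, student, rgb, trimap, gt_alpha, opt))
```

Note that in this sketch the trimap never enters the student path, mirroring the abstract's claim that the prior regulates only the teacher during training and is absent at inference.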