Paper Title

Teacher-Student Consistency For Multi-Source Domain Adaptation

Paper Authors

Ohad Amosy, Gal Chechik

Abstract


In Multi-Source Domain Adaptation (MSDA), models are trained on samples from multiple source domains and used for inference on a different, target domain. Mainstream domain adaptation approaches learn a joint representation of the source and target domains. Unfortunately, a joint representation may emphasize features that are useful for the source domains but hurt inference on the target (negative transfer), or remove essential information about the target domain (knowledge fading). We propose Multi-source Student Teacher (MUST), a novel procedure designed to alleviate these issues. The key idea has two steps: First, we train a teacher network on source labels and infer pseudo labels on the target. Then, we train a student network using the pseudo labels and regularize the teacher to fit the student predictions. This regularization helps the teacher's predictions on the target data remain consistent between epochs. Evaluations of MUST on three MSDA benchmarks (digits, text sentiment analysis, and visual-object recognition) show that MUST outperforms the current SoTA, sometimes by a very large margin. We further analyze the solutions and the dynamics of the optimization, showing that the learned models follow the target distribution density, implicitly using it as information within the unlabeled target data.
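The two-step teacher-student procedure described in the abstract can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: it stands in a tiny logistic-regression model for both networks, uses synthetic data for one source domain and a shifted target domain, and all names and hyperparameters (e.g. the consistency weight `lam`) are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class LogReg:
    """Tiny logistic-regression model standing in for the teacher/student networks."""
    def __init__(self, dim):
        self.w = np.zeros(dim)
        self.b = 0.0

    def predict_proba(self, X):
        return sigmoid(X @ self.w + self.b)

    def step(self, X, y, lr=0.5):
        # One gradient step on binary cross-entropy toward targets y
        p = self.predict_proba(X)
        g = p - y
        self.w -= lr * X.T @ g / len(y)
        self.b -= lr * g.mean()

# Synthetic labeled source domain and a shifted, unlabeled target domain
Xs = rng.normal(0.0, 1.0, (200, 2))
ys = (Xs[:, 0] + Xs[:, 1] > 0).astype(float)
Xt = rng.normal(0.5, 1.0, (100, 2))  # same labeling rule, shifted inputs

teacher, student = LogReg(2), LogReg(2)
lam = 0.5  # weight of the teacher-student consistency term (illustrative)

for epoch in range(100):
    # Step 1: teacher trains on source labels, then infers target pseudo labels
    teacher.step(Xs, ys)
    pseudo = (teacher.predict_proba(Xt) > 0.5).astype(float)
    # Step 2: student trains on the pseudo labels; the teacher is regularized
    # toward the student's target predictions, keeping them consistent across epochs
    student.step(Xt, pseudo)
    teacher.step(Xt, student.predict_proba(Xt), lr=0.5 * lam)

# Accuracy of the student on the (held-out) true target labels
target_acc = ((student.predict_proba(Xt) > 0.5)
              == (Xt[:, 0] + Xt[:, 1] > 0)).mean()
```

In a faithful implementation the teacher would be trained on several source domains and both models would be deep networks; the sketch only shows how the pseudo-labeling and the consistency regularization alternate within each epoch.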
