Paper Title

Discriminative Feature Alignment: Improving Transferability of Unsupervised Domain Adaptation by Gaussian-guided Latent Alignment

Paper Authors

Jing Wang, Jiahong Chen, Jianzhe Lin, Leonid Sigal, Clarence W. de Silva

Paper Abstract

In this study, we focus on the unsupervised domain adaptation problem where an approximate inference model is to be learned from a labeled data domain and expected to generalize well to an unlabeled data domain. The success of unsupervised domain adaptation largely relies on the cross-domain feature alignment. Previous work has attempted to directly align latent features by the classifier-induced discrepancies. Nevertheless, a common feature space cannot always be learned via this direct feature alignment especially when a large domain gap exists. To solve this problem, we introduce a Gaussian-guided latent alignment approach to align the latent feature distributions of the two domains under the guidance of the prior distribution. In such an indirect way, the distributions over the samples from the two domains will be constructed on a common feature space, i.e., the space of the prior, which promotes better feature alignment. To effectively align the target latent distribution with this prior distribution, we also propose a novel unpaired L1-distance by taking advantage of the formulation of the encoder-decoder. The extensive evaluations on nine benchmark datasets validate the superior knowledge transferability through outperforming state-of-the-art methods and the versatility of the proposed method by improving the existing work significantly.
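The abstract's core idea is to align each domain's latent features with a fixed Gaussian prior rather than aligning the two domains directly, using an unpaired L1-style distance between a batch of latent codes and samples drawn from the prior. As a minimal sketch of that idea (not the authors' implementation), the pairing-free trick below sorts each latent dimension before taking absolute differences, which is one common way to compare two sample sets without an explicit pairing; the shapes and the sort-based formulation are illustrative assumptions:

```python
import numpy as np

def unpaired_l1_distance(z, prior_samples):
    """Illustrative unpaired L1 distance between a batch of latent codes
    and samples drawn from a Gaussian prior.

    Sorting each latent dimension independently before taking |.| removes
    the need for an explicit pairing between the two sample sets; this
    sort-based formulation is an assumption for illustration, not
    necessarily the paper's exact distance.
    """
    z_sorted = np.sort(z, axis=0)            # sort each dimension across the batch
    p_sorted = np.sort(prior_samples, axis=0)
    return np.mean(np.abs(z_sorted - p_sorted))

rng = np.random.default_rng(0)

# Hypothetical encoder outputs, deliberately shifted away from the prior.
latent = rng.normal(loc=2.0, scale=1.0, size=(256, 8))
# Samples from the Gaussian prior N(0, I) that guides the alignment.
prior = rng.normal(loc=0.0, scale=1.0, size=(256, 8))

d_far = unpaired_l1_distance(latent, prior)
d_near = unpaired_l1_distance(rng.normal(size=(256, 8)), prior)
```

Minimizing such a distance with respect to the encoder's parameters would pull both domains' latent distributions toward the same prior space, which is the indirect alignment the abstract describes: `d_near` (latents already matching the prior) comes out much smaller than `d_far`.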
