Paper Title

HSIC-InfoGAN: Learning Unsupervised Disentangled Representations by Maximising Approximated Mutual Information

Paper Authors

Xiao Liu, Spyridon Thermos, Pedro Sanchez, Alison Q. O'Neil, Sotirios A. Tsaftaris

Paper Abstract

Learning disentangled representations requires either supervision or the introduction of specific model designs and learning constraints as biases. InfoGAN is a popular disentanglement framework that learns unsupervised disentangled representations by maximising the mutual information between latent representations and their corresponding generated images. Maximisation of mutual information is achieved by introducing an auxiliary network and training with a latent regression loss. In this short exploratory paper, we study the use of the Hilbert-Schmidt Independence Criterion (HSIC) to approximate mutual information between latent representation and image, termed HSIC-InfoGAN. Directly optimising the HSIC loss avoids the need for an additional auxiliary network. We qualitatively compare the level of disentanglement in each model, suggest a strategy to tune the hyperparameters of HSIC-InfoGAN, and discuss the potential of HSIC-InfoGAN for medical applications.
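To make the core idea concrete, the sketch below shows a standard biased HSIC estimator with Gaussian RBF kernels, of the kind that could stand in for InfoGAN's auxiliary network and latent regression loss by directly measuring dependence between latent codes and generated images. This is a minimal illustration, not the authors' implementation: the function names, the kernel bandwidths `sigma_c` / `sigma_img`, and the weighting term `lambda_hsic` are assumed placeholders (the paper proposes its own strategy for tuning such hyperparameters).

```python
import torch


def rbf_kernel(x: torch.Tensor, sigma: float) -> torch.Tensor:
    """Gaussian RBF kernel matrix for a batch of vectors of shape (n, d)."""
    sq_dists = torch.cdist(x, x, p=2).pow(2)
    return torch.exp(-sq_dists / (2.0 * sigma ** 2))


def hsic(x: torch.Tensor, y: torch.Tensor, sigma_x: float, sigma_y: float) -> torch.Tensor:
    """Biased HSIC estimator: trace(K H L H) / (n - 1)^2, with centring matrix H."""
    n = x.shape[0]
    K = rbf_kernel(x, sigma_x)
    L = rbf_kernel(y, sigma_y)
    H = torch.eye(n, device=x.device) - torch.ones(n, n, device=x.device) / n
    return torch.trace(K @ H @ L @ H) / (n - 1) ** 2


# Hypothetical usage inside a generator update: instead of InfoGAN's latent
# regression loss, the HSIC between the latent code c and the (flattened)
# generated image is maximised as an approximation of their mutual information:
#   loss_G = adversarial_loss - lambda_hsic * hsic(c, fake_images.flatten(1), sigma_c, sigma_img)
```

In this sketch, maximising the HSIC term encourages statistical dependence between the latent code and the generated image, which is the role the auxiliary network plays in the original InfoGAN.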
