Paper Title

Your GAN is Secretly an Energy-based Model and You Should use Discriminator Driven Latent Sampling

Paper Authors

Tong Che, Ruixiang Zhang, Jascha Sohl-Dickstein, Hugo Larochelle, Liam Paull, Yuan Cao, Yoshua Bengio

Paper Abstract

We show that the sum of the implicit generator log-density $\log p_g$ of a GAN with the logit score of the discriminator defines an energy function which yields the true data density when the generator is imperfect but the discriminator is optimal, thus making it possible to improve on the typical generator (with implicit density $p_g$). To make this practical, we show that sampling from this modified density can be achieved by sampling in latent space according to an energy-based model induced by the sum of the latent prior log-density and the discriminator output score. This can be achieved by running Langevin MCMC in latent space and then applying the generator function, which we call Discriminator Driven Latent Sampling~(DDLS). We show that DDLS is highly efficient compared to previous methods which work in the high-dimensional pixel space, and that it can be applied to improve on previously trained GANs of many types. We evaluate DDLS on both synthetic and real-world datasets qualitatively and quantitatively. On CIFAR-10, DDLS substantially improves the Inception Score of an off-the-shelf pre-trained SN-GAN~\citep{sngan} from $8.22$ to $9.09$, which is even comparable to the class-conditional BigGAN~\citep{biggan} model. This achieves a new state of the art in the unconditional image synthesis setting without introducing extra parameters or additional training.
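
As a concrete illustration of the procedure the abstract describes, the sketch below runs Langevin MCMC in latent space on the energy given by the negative latent prior log-density minus the discriminator logit, then maps the resulting latents through the generator. This is a minimal sketch in PyTorch, assuming a standard-normal prior and pre-trained `generator` and `discriminator_logit` callables; the step size, noise scale, and number of steps are placeholder values, not the paper's settings.

```python
import torch

def ddls_sample(generator, discriminator_logit, n_samples=64, z_dim=128,
                n_steps=1000, step_size=0.01, noise_scale=0.1, device="cpu"):
    """Minimal sketch of Discriminator Driven Latent Sampling (DDLS).

    Latent-space energy implied by the abstract (up to additive constants):
        E(z) = -log p_0(z) - d(G(z)),
    with a standard-normal prior p_0 and discriminator logit d(.).
    Langevin update: z <- z - (step_size / 2) * grad_z E(z) + noise_scale * eps,
    followed by a single pass through the generator.
    """
    z = torch.randn(n_samples, z_dim, device=device, requires_grad=True)

    for _ in range(n_steps):
        prior_log_density = -0.5 * (z ** 2).sum(dim=1)       # log N(z; 0, I) up to a constant
        logits = discriminator_logit(generator(z)).view(-1)  # pre-sigmoid discriminator score
        energy = -(prior_log_density + logits).sum()

        grad = torch.autograd.grad(energy, z)[0]
        with torch.no_grad():
            z = z - 0.5 * step_size * grad + noise_scale * torch.randn_like(z)
        z.requires_grad_(True)

    with torch.no_grad():
        return generator(z)
```

A typical call would be `samples = ddls_sample(G, D)` with an off-the-shelf generator `G` (e.g. SN-GAN) and its discriminator's pre-sigmoid output `D`; the hyperparameters above are illustrative and would need tuning per model.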
