Paper Title

FakeCLR: Exploring Contrastive Learning for Solving Latent Discontinuity in Data-Efficient GANs

Paper Authors

Ziqiang Li, Chaoyue Wang, Heliang Zheng, Jing Zhang, Bin Li

Paper Abstract

Data-Efficient GANs (DE-GANs), which aim to learn generative models from a limited amount of training data, face several challenges in generating high-quality samples. Since data augmentation strategies have largely alleviated training instability, how to further improve the generative performance of DE-GANs has become a hotspot. Recently, contrastive learning has shown great potential for increasing the synthesis quality of DE-GANs, yet the underlying principles are not well explored. In this paper, we revisit and compare different contrastive learning strategies in DE-GANs, and identify that (i) the current bottleneck of generative performance is the discontinuity of the latent space; and (ii) compared to other contrastive learning strategies, instance perturbation works toward latent-space continuity and brings the major improvement to DE-GANs. Based on these observations, we propose FakeCLR, which applies contrastive learning only to perturbed fake samples, and devise three related training techniques: Noise-related Latent Augmentation, Diversity-aware Queue, and Forgetting Factor of Queue. Our experimental results establish a new state of the art on both few-shot generation and limited-data generation. On multiple datasets, FakeCLR achieves more than 15% FID improvement over existing DE-GANs. Code is available at https://github.com/iceli1007/FakeCLR.
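The abstract does not spell out the training objective, but the core instance-perturbation idea can be sketched as an InfoNCE-style loss on two noise-perturbed copies of each latent code, with a memory queue of past fake embeddings serving as negatives. The snippet below is a minimal illustrative sketch under that assumption; the helper names (`generator`, `proj_head`, `queue`) and hyperparameters (`sigma`, `tau`) are placeholders, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def fake_contrastive_loss(generator, proj_head, queue, z, sigma=0.1, tau=0.07):
    """InfoNCE-style loss on perturbed fake samples (illustrative sketch).

    Two noise-perturbed copies of each latent code form a positive pair;
    embeddings of earlier fake samples stored in `queue` (K x D) act as negatives.
    """
    # Instance perturbation in latent space (the noise-related latent
    # augmentation idea): add small Gaussian noise to each latent code.
    z1 = z + sigma * torch.randn_like(z)
    z2 = z + sigma * torch.randn_like(z)

    # Project the generated images to an embedding space and L2-normalize.
    q = F.normalize(proj_head(generator(z1)), dim=1)   # queries, (N, D)
    k = F.normalize(proj_head(generator(z2)), dim=1)   # positive keys, (N, D)

    # Positive logits: similarity between the two views of the same latent.
    l_pos = (q * k).sum(dim=1, keepdim=True)            # (N, 1)
    # Negative logits: similarity against the memory queue of past fakes.
    l_neg = q @ queue.t()                                # (N, K)

    logits = torch.cat([l_pos, l_neg], dim=1) / tau
    labels = torch.zeros(q.size(0), dtype=torch.long, device=q.device)
    loss = F.cross_entropy(logits, labels)

    # Return the new keys so the caller can enqueue them (and, for example,
    # down-weight or evict stale entries, in the spirit of a forgetting factor).
    return loss, k.detach()
```

In this sketch the loss would be added to the generator's objective, pulling the two perturbed views of the same latent together while pushing them away from previously generated samples, which is one way to encourage latent-space continuity as described above.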
