Paper Title

Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks

Paper Authors

Massimiliano Lupo Pasini, Junqi Yin

Paper Abstract

We propose a stable, parallel approach to train Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs) under the constraint of a fixed computational budget. Unlike previous distributed GAN training techniques, our approach avoids inter-process communication, reduces the risk of mode collapse, and enhances scalability by using multiple generators, each one of them concurrently trained on a single data label. The use of the Wasserstein metric also reduces the risk of cycling by stabilizing the training of each generator. We illustrate the approach on the CIFAR10, CIFAR100, and ImageNet1k datasets, three standard benchmark image datasets, maintaining the original resolution of the images for each dataset. Performance is assessed in terms of scalability and final accuracy within a limited fixed computational time and computational resources. To measure accuracy, we use the inception score, the Fréchet inception distance, and image quality. An improvement in inception score and Fréchet inception distance is shown in comparison to previous results obtained by performing the parallel approach on deep convolutional conditional generative adversarial neural networks (DC-CGANs), as well as an improvement in the image quality of the new images created by the GANs approach. Weak scaling is attained on both datasets using up to 2,000 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
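
The label-parallel idea summarized in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes PyTorch, small fully connected networks, and the original WGAN recipe (RMSprop plus weight clipping), with hypothetical values for the latent dimension, hidden sizes, learning rate, and clipping constant. It only shows the core mechanism, training one independent Wasserstein generator/critic pair per data label, so that per-label jobs need no inter-process communication.

```python
# Illustrative sketch (not the paper's code): one W-GAN generator/critic pair
# trained independently on the samples of a single label. In a parallel run,
# each call to train_one_label could be dispatched to its own GPU / MPI rank,
# which is what removes inter-process communication.
import torch
import torch.nn as nn

LATENT_DIM = 100   # assumed latent-noise dimension
N_CRITIC = 5       # critic updates per generator update (standard WGAN choice)
CLIP = 0.01        # weight-clipping constant from the original WGAN recipe

def make_generator(out_features):
    return nn.Sequential(
        nn.Linear(LATENT_DIM, 256), nn.ReLU(),
        nn.Linear(256, out_features), nn.Tanh(),
    )

def make_critic(in_features):
    return nn.Sequential(
        nn.Linear(in_features, 256), nn.LeakyReLU(0.2),
        nn.Linear(256, 1),  # no sigmoid: the critic outputs an unbounded score
    )

def train_one_label(real_loader, img_features, steps=1000, device="cpu"):
    """Train a generator/critic pair on one label only.

    real_loader is assumed to yield batches of flattened images that all
    belong to the same label.
    """
    G = make_generator(img_features).to(device)
    D = make_critic(img_features).to(device)
    opt_g = torch.optim.RMSprop(G.parameters(), lr=5e-5)
    opt_d = torch.optim.RMSprop(D.parameters(), lr=5e-5)

    data_iter = iter(real_loader)
    for _ in range(steps):
        # Critic: maximize E[D(x_real)] - E[D(G(z))]
        for _ in range(N_CRITIC):
            try:
                real = next(data_iter)
            except StopIteration:
                data_iter = iter(real_loader)
                real = next(data_iter)
            real = real.to(device)
            z = torch.randn(real.size(0), LATENT_DIM, device=device)
            fake = G(z).detach()
            loss_d = -(D(real).mean() - D(fake).mean())
            opt_d.zero_grad(); loss_d.backward(); opt_d.step()
            # Weight clipping approximately enforces the 1-Lipschitz constraint
            for p in D.parameters():
                p.data.clamp_(-CLIP, CLIP)

        # Generator: maximize E[D(G(z))]
        z = torch.randn(64, LATENT_DIM, device=device)
        loss_g = -D(G(z)).mean()
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return G
```

Because each per-label pair never exchanges gradients or parameters with the others, launching one such job per label (e.g., one GPU or MPI rank per label) is what allows the weak scaling reported in the abstract.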
