Paper Title

Any-resolution Training for High-resolution Image Synthesis

Paper Authors

Lucy Chai, Michael Gharbi, Eli Shechtman, Phillip Isola, Richard Zhang

Paper Abstract

Generative models operate at fixed resolution, even though natural images come in a variety of sizes. As high-resolution details are downsampled away and low-resolution images are discarded altogether, precious supervision is lost. We argue that every pixel matters and create datasets with variable-size images, collected at their native resolutions. To take advantage of varied-size data, we introduce continuous-scale training, a process that samples patches at random scales to train a new generator with variable output resolutions. First, conditioning the generator on a target scale allows us to generate higher resolution images than previously possible, without adding layers to the model. Second, by conditioning on continuous coordinates, we can sample patches that still obey a consistent global layout, which also allows for scalable training at higher resolutions. Controlled FFHQ experiments show that our method can take advantage of multi-resolution training data better than discrete multi-scale approaches, achieving better FID scores and cleaner high-frequency details. We also train on other natural image domains including churches, mountains, and birds, and demonstrate arbitrary scale synthesis with both coherent global layouts and realistic local details, going beyond 2K resolution in our experiments. Our project page is available at: https://chail.github.io/anyres-gan/.
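The core mechanism described in the abstract, sampling fixed-size patches at random scales together with their continuous coordinates in the full image, can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch illustration of that idea, not the authors' released code; the function name `sample_patch`, its parameters, and the scale convention are assumptions made for clarity.

```python
# Minimal sketch of continuous-scale patch sampling (hypothetical, not the
# authors' implementation). Assumes images are float tensors (3, H, W) kept
# at their native resolutions.
import random
import torch
import torch.nn.functional as F

def sample_patch(image, patch_size=256, min_scale=0.25, max_scale=1.0):
    """Crop a fixed-size patch at a random scale.

    scale = 1.0 crops patch_size native pixels; smaller scales cover a larger
    region of the image and downsample it to patch_size. Returns the patch,
    its continuous coordinates in [0, 1]^2, and the sampled scale, which
    would serve as conditioning signals for a scale/coordinate-conditioned
    generator.
    """
    _, H, W = image.shape
    scale = random.uniform(min_scale, max_scale)
    crop = int(round(patch_size / scale))
    crop = min(crop, H, W)  # never crop beyond the image borders
    top = random.randint(0, H - crop)
    left = random.randint(0, W - crop)
    region = image[:, top:top + crop, left:left + crop]
    patch = F.interpolate(region.unsqueeze(0), size=(patch_size, patch_size),
                          mode='bilinear', align_corners=False).squeeze(0)
    # Normalized coordinates of the patch within the full image.
    coords = torch.tensor([left / W, top / H,
                           (left + crop) / W, (top + crop) / H])
    return patch, coords, scale

# Example usage (hypothetical):
# patch, coords, scale = sample_patch(torch.rand(3, 1024, 1536))
```

In the setup the abstract describes, the sampled scale and continuous coordinates condition the generator, so patches drawn at different scales and positions still share a consistent global layout while training remains tractable at high resolutions.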
