Paper Title

DrapeNet: Garment Generation and Self-Supervised Draping

Authors

Luca De Luigi, Ren Li, Benoît Guillard, Mathieu Salzmann, Pascal Fua

Abstract

Recent approaches to drape garments quickly over arbitrary human bodies leverage self-supervision to eliminate the need for large training sets. However, they are designed to train one network per clothing item, which severely limits their generalization abilities. In our work, we rely on self-supervision to train a single network to drape multiple garments. This is achieved by predicting a 3D deformation field conditioned on the latent codes of a generative network, which models garments as unsigned distance fields. Our pipeline can generate and drape previously unseen garments of any topology, whose shape can be edited by manipulating their latent codes. Being fully differentiable, our formulation makes it possible to recover accurate 3D models of garments from partial observations -- images or 3D scans -- via gradient descent. Our code is publicly available at https://github.com/liren2515/DrapeNet.
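The two ideas the abstract combines can be illustrated with a toy sketch: a deformation field whose output depends on a garment latent code, and recovery of that code from observations by gradient descent. This is a minimal, hypothetical NumPy illustration, not the paper's actual network; the linear "deformation field" here stands in for DrapeNet's conditioned MLP, and all names and sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D_LATENT, N_PTS = 4, 50  # toy latent-code size and garment point count

# Toy "deformation field": the displacement of each 3D point depends
# (here, linearly) on the latent code z -- a stand-in for the paper's
# latent-conditioned deformation network.
W = rng.normal(size=(N_PTS * 3, D_LATENT))

def drape(rest_pts, z):
    """Deform canonical garment points according to latent code z."""
    return rest_pts + (W @ z).reshape(N_PTS, 3)

rest = rng.normal(size=(N_PTS, 3))    # canonical (undraped) garment points
z_true = rng.normal(size=D_LATENT)    # unknown code of the observed garment
target = drape(rest, z_true)          # simulated "observation" of the drape

# Because the pipeline is differentiable in z, the code can be recovered
# from observations by gradient descent on a squared reconstruction error.
z = np.zeros(D_LATENT)
lr = 1e-3
for _ in range(1000):
    residual = (drape(rest, z) - target).reshape(-1)  # flattened error
    grad = 2.0 * W.T @ residual                       # analytic gradient
    z -= lr * grad

print(np.allclose(z, z_true, atol=1e-3))
```

In the paper the same fitting principle applies with a neural deformation field and partial observations (images or 3D scans) in place of this toy linear model and full point supervision.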
