Paper Title
Image-Driven Furniture Style for Interactive 3D Scene Modeling
Paper Authors
Paper Abstract
Creating realistically styled spaces is a complex task that requires design know-how for which furniture pieces go well together. Interior style follows abstract rules involving color, geometry, and other visual elements. To follow such rules, users manually select similar-style items from large repositories of 3D furniture models, a process that is both laborious and time-consuming. We propose a method for fast-tracking style-similarity tasks by learning furniture style-compatibility from interior scene images. Such images contain more style information than images depicting a single furniture piece. To capture style, we train a deep learning network on a classification task. Based on image embeddings extracted from our network, we measure the stylistic compatibility of furniture. We demonstrate our method with several 3D model style-compatibility results and with an interactive system for modeling style-consistent scenes.
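The abstract describes measuring style compatibility from image embeddings but does not specify the metric. A minimal sketch of one common approach, assuming embeddings have already been extracted as fixed-length vectors (the function names and the use of cosine similarity are illustrative assumptions, not the paper's stated method):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_by_compatibility(query: np.ndarray,
                          candidates: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    """Rank candidate furniture models by embedding similarity to a query piece.

    `query` and each value in `candidates` are embedding vectors of the same
    dimension, e.g. taken from a penultimate layer of a classification network.
    """
    scores = {name: cosine_similarity(query, emb) for name, emb in candidates.items()}
    # Highest similarity first: the most style-compatible candidates lead the list.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

With such a ranking, an interactive modeling tool could suggest the top-scoring repository items whenever the user places a new furniture piece.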