Paper Title

Physically-Based Editing of Indoor Scene Lighting from a Single Image

Paper Authors

Zhengqin Li, Jia Shi, Sai Bi, Rui Zhu, Kalyan Sunkavalli, Miloš Hašan, Zexiang Xu, Ravi Ramamoorthi, Manmohan Chandraker

Paper Abstract

We present a method to edit complex indoor lighting from a single image with its predicted depth and light source segmentation masks. This is an extremely challenging problem that requires modeling complex light transport, and disentangling HDR lighting from material and geometry with only a partial LDR observation of the scene. We tackle this problem using two novel components: 1) a holistic scene reconstruction method that estimates scene reflectance and parametric 3D lighting, and 2) a neural rendering framework that re-renders the scene from our predictions. We use physically-based indoor light representations that allow for intuitive editing, and infer both visible and invisible light sources. Our neural rendering framework combines physically-based direct illumination and shadow rendering with deep networks to approximate global illumination. It can capture challenging lighting effects, such as soft shadows, directional lighting, specular materials, and interreflections. Previous single image inverse rendering methods usually entangle scene lighting and geometry and only support applications like object insertion. Instead, by combining parametric 3D lighting estimation with neural scene rendering, we demonstrate the first automatic method to achieve full scene relighting, including light source insertion, removal, and replacement, from a single image. All source code and data will be publicly released.
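
To make the rendering decomposition concrete, the sketch below illustrates the kind of physically-based direct illumination term that a neural global-illumination network could build on, and shows how editing a parametric light reduces to changing its parameters and re-rendering. Everything here is an assumption for exposition: the function `direct_lambertian`, the point-light parameterization, and the dummy inputs are illustrative placeholders, not the authors' released code (the paper additionally models visible and invisible lights such as lamps and windows, shadows, specularity, and learned indirect lighting).

```python
# Minimal sketch (not the authors' implementation): Lambertian direct
# shading from an editable point light, given per-pixel geometry and
# reflectance such as a method like this might recover from one image.
import torch
import torch.nn.functional as F

def direct_lambertian(points, normals, albedo, light_pos, light_rgb):
    """points/normals: (H, W, 3) per-pixel 3D positions and unit normals
    (e.g., derived from predicted depth); albedo: (H, W, 3) diffuse
    reflectance; light_pos: (3,) editable light position; light_rgb: (3,)
    HDR intensity. Returns direct radiance (H, W, 3), up to constant
    factors (the 1/pi Lambertian normalization is omitted)."""
    to_light = light_pos - points                      # (H, W, 3)
    dist2 = (to_light ** 2).sum(-1, keepdim=True)      # inverse-square falloff
    l = to_light / dist2.sqrt().clamp(min=1e-6)        # unit light direction
    cos = (normals * l).sum(-1, keepdim=True).clamp(min=0.0)
    return albedo * light_rgb * cos / dist2.clamp(min=1e-6)

# Dummy inputs standing in for the inverse-rendering predictions.
H, W = 120, 160
points = torch.rand(H, W, 3) * 4.0
normals = F.normalize(torch.rand(H, W, 3) - 0.5, dim=-1)
albedo = torch.full((H, W, 3), 0.6)

# Relighting = edit the light's parameters, then re-render.
img_before = direct_lambertian(points, normals, albedo,
                               torch.tensor([2.0, 1.0, 2.5]),
                               torch.tensor([10.0, 9.0, 8.0]))
img_after = direct_lambertian(points, normals, albedo,
                              torch.tensor([0.5, 2.0, 1.0]),   # moved light
                              torch.tensor([2.0, 4.0, 12.0]))  # recolored light
```

In the full method, a term like this (plus physically-based shadows) would supply the direct component, while deep networks approximate the remaining global-illumination effects such as interreflections and soft indirect shading.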
