
Neural Illumination: Lighting Prediction for Indoor Environments

Author(s): Song, Shuran; Funkhouser, Thomas

To refer to this page, use: http://arks.princeton.edu/ark:/88435/pr11z7k
Abstract: This paper addresses the task of estimating the light arriving from all directions at a 3D point observed at a selected pixel in an RGB image. The task is challenging because it requires predicting a mapping from a partial RGB observation by a camera to a complete illumination map for a different 3D point, a mapping that depends on the 3D location of the selected pixel, the distribution of unobserved light sources, occlusions by scene geometry, and more. Previous methods attempt to learn this complex mapping directly with a single black-box neural network, which often fails to estimate high-frequency lighting details for scenes with complicated 3D geometry. Instead, we propose "Neural Illumination," a new approach that decomposes illumination prediction into simpler differentiable sub-tasks: 1) geometry estimation, 2) scene completion, and 3) LDR-to-HDR estimation. The advantage of this approach is that the sub-tasks are relatively easy to learn and can be trained with direct supervision, while the whole pipeline remains fully differentiable and can be fine-tuned with end-to-end supervision. Experiments show that our approach performs significantly better, both quantitatively and qualitatively, than prior work.
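The decomposition described in the abstract lends itself to a compact structural illustration. The following PyTorch-style sketch is purely hypothetical: the module bodies are small placeholders, the names (NeuralIlluminationSketch, conv_block) are invented for this example, and the geometry-based warping of the input observation to the selected 3D point's illumination panorama, which the actual method performs, is omitted for brevity. It shows only the idea that each stage can receive direct supervision while gradients still flow end-to-end.

    import torch
    import torch.nn as nn

    def conv_block(c_in, c_out):
        # Tiny placeholder network; the paper's actual per-stage
        # architectures are not reproduced here.
        return nn.Sequential(
            nn.Conv2d(c_in, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, c_out, 3, padding=1),
        )

    class NeuralIlluminationSketch(nn.Module):
        # Three differentiable sub-tasks chained together:
        # 1) geometry estimation, 2) scene completion, 3) LDR-to-HDR estimation.
        def __init__(self):
            super().__init__()
            self.geometry = conv_block(3, 1)    # RGB -> per-pixel geometry (e.g., depth)
            self.completion = conv_block(4, 3)  # RGB + geometry -> completed LDR illumination map
            self.ldr_to_hdr = conv_block(3, 3)  # LDR illumination map -> HDR illumination map

        def forward(self, rgb):
            geom = self.geometry(rgb)
            ldr = self.completion(torch.cat([rgb, geom], dim=1))
            hdr = self.ldr_to_hdr(ldr)
            return geom, ldr, hdr

    # Each intermediate output (geom, ldr, hdr) can be trained with direct
    # supervision; because the composition is differentiable, the whole
    # pipeline can also be fine-tuned end-to-end from a loss on hdr alone.
    model = NeuralIlluminationSketch()
    rgb = torch.randn(1, 3, 128, 256)
    geom, ldr, hdr = model(rgb)
    hdr.abs().mean().backward()  # placeholder loss, just to demonstrate end-to-end gradients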
Publication Date: 2019
Citation: Song, Shuran, and Thomas Funkhouser. "Neural Illumination: Lighting Prediction for Indoor Environments." In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019, pp. 6911-6919. doi:10.1109/CVPR.2019.00708
DOI: 10.1109/CVPR.2019.00708
EISSN: 2575-7075
Pages: 6911-6919
Type of Material: Conference Article
Journal/Proceeding Title: IEEE/CVF Conference on Computer Vision and Pattern Recognition
Version: Author's manuscript

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.