Steady-State Non-Line-Of-Sight Imaging

Author(s): Chen, Wenzheng; Daneau, Simon; Mannan, Fahim; Heide, Felix

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1xv7d
Abstract: Conventional intensity cameras recover objects in the direct line-of-sight of the camera, whereas occluded scene parts are considered lost in this process. Non-line-of-sight imaging (NLOS) aims to recover these occluded objects by analyzing their indirect reflections on visible scene surfaces. Existing NLOS methods temporally probe the indirect light transport to unmix light paths based on their travel time, which mandates specialized instrumentation that suffers from low photon efficiency, high cost, and mechanical scanning. We depart from temporal probing and demonstrate steady-state NLOS imaging using conventional intensity sensors and continuous illumination. Instead of assuming perfectly isotropic scattering, the proposed method exploits directionality in the hidden surface reflectance, resulting in (small) spatial variation of the indirect reflections under varying illumination. To tackle the shape-dependence of these variations, we propose a trainable architecture that learns to map diffuse indirect reflections to scene reflectance using only synthetic training data. Relying on consumer color image sensors with high fill factor, high quantum efficiency, and low read-out noise, we demonstrate high-fidelity color NLOS imaging for scene configurations previously tackled only with picosecond time resolution.
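The abstract describes a trainable architecture that maps the diffuse indirect reflections observed on a visible relay surface to the hidden scene's reflectance, trained purely on synthetic data. As a rough illustration of that idea, the sketch below is a minimal PyTorch encoder-decoder performing such an image-to-image mapping; the NLOSSketchNet name, layer layout and widths, and the L1 training step are assumptions for illustration, not the network or loss used in the paper.

```python
# Minimal sketch (assumed architecture, not the paper's): a convolutional
# encoder-decoder that takes a steady-state indirect-reflection image of
# the illuminated relay wall and predicts the hidden scene's reflectance.
import torch
import torch.nn as nn

class NLOSSketchNet(nn.Module):
    def __init__(self, in_ch=3, out_ch=3, base=32):
        super().__init__()
        # Encoder: downsample the wall observation to a compact code.
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, base, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base, base * 2, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base * 2, base * 4, 4, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to a reflectance image of the hidden scene.
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(base * 4, base * 2, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base * 2, base, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base, out_ch, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, wall_image):
        return self.dec(self.enc(wall_image))

if __name__ == "__main__":
    net = NLOSSketchNet()
    # Random stand-ins for one synthetic training pair: wall observation in,
    # hidden-scene reflectance out (the paper trains on synthetic data only).
    wall = torch.rand(1, 3, 128, 128)
    target = torch.rand(1, 3, 128, 128)
    opt = torch.optim.Adam(net.parameters(), lr=1e-4)
    loss = nn.functional.l1_loss(net(wall), target)
    loss.backward()
    opt.step()
    print("predicted reflectance:", net(wall).shape, "loss:", float(loss))
```

Note that the method in the abstract exploits how the indirect reflections change as the illumination position varies; a faithful implementation would condition on multiple illumination positions, which this single-image sketch omits.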
Publication Date: 2019
Citation: Chen, Wenzheng, Simon Daneau, Fahim Mannan, and Felix Heide. "Steady-State Non-Line-Of-Sight Imaging." In Conference on Computer Vision and Pattern Recognition, 2019, pp. 6783-6792. doi:10.1109/CVPR.2019.00695
DOI: 10.1109/CVPR.2019.00695
ISSN: 1063-6919
EISSN: 2575-7075
Pages: 6783-6792
Type of Material: Conference Article
Journal/Proceeding Title: Conference on Computer Vision and Pattern Recognition
Version: Author's manuscript