
Steady-State Non-Line-Of-Sight Imaging

Author(s): Chen, Wenzheng; Daneau, Simon; Brosseau, Colin; Heide, Felix

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1xv7d
Full metadata record
DC Field | Value | Language
dc.contributor.author | Chen, Wenzheng | -
dc.contributor.author | Daneau, Simon | -
dc.contributor.author | Brosseau, Colin | -
dc.contributor.author | Heide, Felix | -
dc.date.accessioned | 2021-10-08T19:46:48Z | -
dc.date.available | 2021-10-08T19:46:48Z | -
dc.date.issued | 2019 | en_US
dc.identifier.citation | Chen, Wenzheng, Simon Daneau, Fahim Mannan, and Felix Heide. "Steady-State Non-Line-Of-Sight Imaging." In Conference on Computer Vision and Pattern Recognition (2019): pp. 6783-6792. doi:10.1109/CVPR.2019.00695 | en_US
dc.identifier.issn | 1063-6919 | -
dc.identifier.uri | https://openaccess.thecvf.com/content_CVPR_2019/papers/Chen_Steady-State_Non-Line-Of-Sight_Imaging_CVPR_2019_paper.pdf | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1xv7d | -
dc.description.abstract | Conventional intensity cameras recover objects in the direct line-of-sight of the camera, whereas occluded scene parts are considered lost in this process. Non-line-of-sight imaging (NLOS) aims at recovering these occluded objects by analyzing their indirect reflections on visible scene surfaces. Existing NLOS methods temporally probe the indirect light transport to unmix light paths based on their travel time, which mandates specialized instrumentation that suffers from low photon efficiency, high cost, and mechanical scanning. We depart from temporal probing and demonstrate steady-state NLOS imaging using conventional intensity sensors and continuous illumination. Instead of assuming perfectly isotropic scattering, the proposed method exploits directionality in the hidden surface reflectance, resulting in (small) spatial variation of their indirect reflections for varying illumination. To tackle the shape-dependence of these variations, we propose a trainable architecture which learns to map diffuse indirect reflections to scene reflectance using only synthetic training data. Relying on consumer color image sensors, with high fill factor, high quantum efficiency and low read-out noise, we demonstrate high-fidelity color NLOS imaging for scene configurations tackled before with picosecond time resolution. | en_US
dc.format.extent | 6783 - 6792 | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | Conference on Computer Vision and Pattern Recognition | en_US
dc.rights | Author's manuscript | en_US
dc.title | Steady-State Non-Line-Of-Sight Imaging | en_US
dc.type | Conference Article | en_US
dc.identifier.doi | 10.1109/CVPR.2019.00695 | -
dc.identifier.eissn | 2575-7075 | -
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US
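The abstract describes a trainable architecture that maps images of diffuse indirect reflections, captured with an ordinary intensity camera, to the reflectance of the hidden scene, trained only on synthetic data. The sketch below is not the authors' code; it is a minimal PyTorch-style illustration of that kind of learned mapping, with all layer sizes, the class name IndirectToReflectanceNet, and the random placeholder training pairs being assumptions made for the example.

```python
# Minimal sketch (not the authors' released implementation) of a learned
# mapping from an indirect-reflection image observed on a visible wall to
# the reflectance image of the hidden scene. Architecture and sizes are
# illustrative assumptions only.
import torch
import torch.nn as nn


class IndirectToReflectanceNet(nn.Module):
    """Map a 3-channel indirect-reflection image to a 3-channel reflectance image."""

    def __init__(self, base_channels: int = 32):
        super().__init__()
        # Encoder: downsample the faint spatial variations of the indirect light.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, base_channels, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(base_channels, base_channels * 2, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to image resolution to predict hidden reflectance.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(base_channels * 2, base_channels, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base_channels, 3, kernel_size=4, stride=2, padding=1),
            nn.Sigmoid(),  # reflectance values in [0, 1]
        )

    def forward(self, indirect: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(indirect))


if __name__ == "__main__":
    # Training uses synthetic pairs only, as the abstract states; here random
    # tensors stand in for rendered (indirect observation, reflectance) data.
    net = IndirectToReflectanceNet()
    optimizer = torch.optim.Adam(net.parameters(), lr=1e-4)
    loss_fn = nn.L1Loss()

    synthetic_indirect = torch.rand(4, 3, 128, 128)     # rendered wall observations
    synthetic_reflectance = torch.rand(4, 3, 128, 128)  # ground-truth hidden reflectance

    prediction = net(synthetic_indirect)
    loss = loss_fn(prediction, synthetic_reflectance)
    loss.backward()
    optimizer.step()
    print(f"one training step, L1 loss = {loss.item():.4f}")
```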

Files in This Item:
File | Description | Size | Format
SteadyStateNonLineOfSightImaging.pdf | | 3.86 MB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.