Gated2Depth: Real-Time Dense Lidar From Gated Images
Author(s): Gruber, Tobias; Julca-Aguilar, Frank; Bijelic, Mario; Heide, Felix
To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1v24r
Abstract: | We present an imaging framework which converts three images from a gated camera into high-resolution depth maps with depth accuracy comparable to pulsed lidar measurements. Existing scanning lidar systems achieve low spatial resolution at large ranges due to mechanically limited angular sampling rates, restricting scene understanding tasks to close-range clusters with dense sampling. Moreover, today's pulsed lidar scanners suffer from high cost, power consumption, and large form factors, and they fail in the presence of strong backscatter. We depart from point scanning and demonstrate that it is possible to turn a low-cost CMOS gated imager into a dense depth camera with at least 80 m range by learning depth from three gated images. The proposed architecture exploits semantic context across gated slices, and is trained on a synthetic discriminator loss without the need for dense depth labels. The proposed replacement for scanning lidar systems is real-time, handles backscatter, and provides dense depth at long ranges. We validate our approach in simulation and on real-world data acquired over 4,000 km of driving in northern Europe. Data and code are available at https://github.com/gruberto/Gated2Depth. |
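The intuition behind depth from gated imaging is that each of the three slices integrates scene returns from a different range interval, so the relative intensities of a pixel across slices encode its distance. The sketch below is a naive intensity-weighted-centroid baseline illustrating that idea, not the paper's learned architecture (which replaces such analytic estimates with a network exploiting semantic context across slices); the gate center distances and function name are illustrative assumptions.

```python
import numpy as np

# Hypothetical gate center distances in meters; the real system's gates
# are set by the camera's laser and gating configuration.
GATE_CENTERS_M = np.array([20.0, 45.0, 70.0])

def naive_gated_depth(slices: np.ndarray, gate_centers_m: np.ndarray) -> np.ndarray:
    """Per-pixel depth as the intensity-weighted centroid of gate centers.

    slices:         array of shape (3, H, W), one gated intensity image per slice.
    gate_centers_m: array of shape (3,), center distance of each range gate.
    """
    s = np.asarray(slices, dtype=np.float64)
    total = s.sum(axis=0)
    centers = gate_centers_m.reshape(-1, 1, 1)
    depth = (s * centers).sum(axis=0) / np.maximum(total, 1e-8)
    depth[total < 1e-8] = np.nan  # no return, no depth information
    return depth

# Example on random data (stand-in for three gated captures).
slices = np.random.rand(3, 512, 1024)
depth_map = naive_gated_depth(slices, GATE_CENTERS_M)
```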
Publication Date: | 2019 |
Citation: | Gruber, Tobias, Frank Julca-Aguilar, Mario Bijelic, and Felix Heide. "Gated2Depth: Real-Time Dense Lidar From Gated Images." In IEEE/CVF International Conference on Computer Vision (ICCV), 2019, pp. 1506-1516. doi:10.1109/ICCV.2019.00159 |
DOI: | 10.1109/ICCV.2019.00159 |
ISSN: | 1550-5499 |
EISSN: | 2380-7504 |
Pages: | 1506 - 1516 |
Type of Material: | Conference Article |
Journal/Proceeding Title: | IEEE/CVF International Conference on Computer Vision |
Version: | Author's manuscript |