A survey on computational spectral reconstruction methods from RGB to hyperspectral imaging

Author(s): Zhang, Jingang; Su, Runmu; Fu, Qiang; Ren, Wenqi; Heide, Felix; et al.

Abstract: Hyperspectral imaging enables many versatile applications thanks to its ability to capture abundant spatial and spectral information, which is crucial for identifying substances. However, devices for acquiring hyperspectral images are typically expensive and complicated, which hinders their adoption in consumer applications such as daily food inspection and point-of-care medical screening. Recently, many computational spectral imaging methods have been proposed that reconstruct hyperspectral information directly from widely available RGB images. These reconstruction methods dispense with burdensome spectral camera hardware while maintaining high spectral resolution and imaging performance. We present a thorough investigation of more than 25 state-of-the-art spectral reconstruction methods, categorized as prior-based or data-driven. Simulations on open-source datasets show that prior-based methods are better suited to data-scarce situations, while data-driven methods can unleash the full potential of deep learning when big data are available. We identify current challenges faced by these methods (e.g., loss function design, spectral accuracy, data generalization) and summarize several trends for future work. With the rapid expansion of datasets and the advent of more advanced neural networks, learnable methods with strong feature representation abilities are very promising. This comprehensive review can serve as a fruitful reference for peer researchers, paving the way for the development of computational hyperspectral imaging.
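To make the reconstruction problem concrete, the sketch below illustrates the standard forward model (an RGB pixel is a 3-channel projection of a B-band spectrum through camera sensitivity functions) and the simplest family of methods the survey covers: a linear RGB-to-spectrum mapping learned by ridge regression. All data here are synthetic, and the shapes, regularization weight, and variable names are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

# Synthetic illustration only: random sensitivities and spectra stand in
# for real camera curves and hyperspectral training data.
rng = np.random.default_rng(0)

B = 31    # number of spectral bands (e.g., 400-700 nm at 10 nm steps)
N = 5000  # number of training pixels

# Camera spectral sensitivity functions, one row per RGB channel (3 x B).
css = np.abs(rng.normal(size=(3, B)))

# Training spectra (B x N) and their simulated RGB projections (3 x N).
H = np.abs(rng.normal(size=(B, N)))
R = css @ H  # forward model: RGB = sensitivities @ spectrum

# Ridge-regression mapping W (B x 3): argmin ||W R - H||^2 + lam ||W||^2,
# solved in closed form since R R^T is only 3 x 3.
lam = 1e-3
W = H @ R.T @ np.linalg.inv(R @ R.T + lam * np.eye(3))

# Reconstruct spectra for previously unseen RGB pixels.
rgb_test = css @ np.abs(rng.normal(size=(B, 10)))
spectra_est = W @ rgb_test  # (B x 10) estimated spectra
```

Prior-based methods refine this linear baseline with hand-crafted constraints (e.g., sparsity over a learned spectral dictionary), while the data-driven methods surveyed replace the mapping `W` with a deep network trained end-to-end.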
Publication Date: 13-Jul-2022
DOI: 10.1038/s41598-022-16223-1
ISSN: 2045-2322
Language: en
Type of Material: Journal Article
Journal/Proceeding Title: Scientific Reports
Version: Final published version. This is an open access article.

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.