CaImAn an open source tool for scalable calcium imaging data analysis
Author(s): Giovannucci, Andrea; Friedrich, Johannes; Gunn, Pat; Kalfon, Jérémie; Brown, Brandon L; et al.
To refer to this page use:
http://arks.princeton.edu/ark:/88435/pr18c5q
Abstract: Advances in fluorescence microscopy enable monitoring of larger brain areas in vivo with finer time resolution. The resulting data rates require reproducible analysis pipelines that are reliable, fully automated, and scalable to datasets generated over the course of months. We present CaImAn, an open-source library for calcium imaging data analysis. CaImAn provides automatic and scalable methods to address problems common to pre-processing, including motion correction, neural activity identification, and registration across different sessions of data collection. It does this while requiring minimal user intervention, with good scalability on computers ranging from laptops to high-performance computing clusters. CaImAn is suitable for two-photon and one-photon imaging, and also enables real-time analysis on streaming data. To benchmark the performance of CaImAn we collected and combined a corpus of manual annotations from multiple labelers on nine mouse two-photon datasets. We demonstrate that CaImAn achieves near-human performance in detecting locations of active neurons.
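To make the pipeline stages named in the abstract concrete (motion correction, source extraction, component screening), here is a minimal sketch of a CaImAn batch workflow in Python. It follows the general shape of the package's demo scripts; module paths and parameter names reflect the CaImAn API but may differ between versions, and the file name and parameter values are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of a CaImAn batch pipeline: motion correction,
# CNMF source extraction, and quality-based component screening.
# 'movie.tif' and all parameter values are hypothetical placeholders.
import numpy as np
import caiman as cm
from caiman.motion_correction import MotionCorrect
from caiman.source_extraction.cnmf import cnmf, params

fnames = ['movie.tif']  # hypothetical two-photon movie

# A small illustrative subset of CNMFParams fields.
opts = params.CNMFParams(params_dict={
    'fnames': fnames,
    'fr': 30,           # imaging rate (Hz), dataset-dependent
    'pw_rigid': True,   # piecewise-rigid motion correction
    'p': 1,             # order of the autoregressive model
    'K': 4,             # expected components per patch
    'gSig': [4, 4],     # expected half-size of neurons (pixels)
})

# Start a local cluster for parallel patch processing
# (backend names vary across CaImAn versions).
_, dview, n_processes = cm.cluster.setup_cluster(backend='local')

# Motion correction.
mc = MotionCorrect(fnames, dview=dview, **opts.get_group('motion'))
mc.motion_correct(save_movie=True)

# Memory-map the corrected movie so patches can be processed in parallel.
fname_new = cm.save_memmap(mc.mmap_file, base_name='memmap_',
                           order='C', dview=dview)
Yr, dims, T = cm.load_memmap(fname_new)
images = np.reshape(Yr.T, [T] + list(dims), order='F')

# Run CNMF to extract spatial footprints and temporal traces.
cnm = cnmf.CNMF(n_processes, params=opts, dview=dview)
cnm = cnm.fit(images)

# Screen detected components by quality metrics.
cnm.estimates.evaluate_components(images, cnm.params, dview=dview)
print(f'{len(cnm.estimates.idx_components)} accepted components')

cm.stop_server(dview=dview)
```

One-photon data and streaming (online) analysis follow the same parameter-object pattern but use different entry points in the library; see the CaImAn documentation for those workflows.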
Publication Date: 17-Jan-2019
Citation: Giovannucci, Andrea, Friedrich, Johannes, Gunn, Pat, Kalfon, Jérémie, Brown, Brandon L, Koay, Sue Ann, Taxidis, Jiannis, Najafi, Farzaneh, Gauthier, Jeffrey L, Zhou, Pengcheng, Khakh, Baljit S, Tank, David W, Chklovskii, Dmitri B, Pnevmatikakis, Eftychios A. (2019). CaImAn an open source tool for scalable calcium imaging data analysis. eLife, 8. doi:10.7554/elife.38173
DOI: 10.7554/elife.38173
ISSN: 2050-084X
EISSN: 2050-084X
Language: eng
Type of Material: Journal Article
Journal/Proceeding Title: eLife
Version: Final published version. This is an open access article.