
Correcting motion induced fluorescence artifacts in two-channel neural imaging

Author(s): Creamer, Matthew S; Chen, Kevin S; Leifer, Andrew M; Pillow, Jonathan W

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1m61bq49
Abstract: Imaging neural activity in a behaving animal presents unique challenges in part because motion from an animal’s movement creates artifacts in fluorescence intensity time series that are difficult to distinguish from neural signals of interest. One approach to mitigating these artifacts is to image two channels simultaneously: one that captures an activity-dependent fluorophore, such as GCaMP, and another that captures an activity-independent fluorophore, such as RFP. Because the activity-independent channel contains the same motion artifacts as the activity-dependent channel, but no neural signals, the two together can be used to identify and remove the artifacts. However, existing approaches for this correction, such as taking the ratio of the two channels, do not account for channel-independent noise in the measured fluorescence. Here, we present Two-channel Motion Artifact Correction (TMAC), a method that seeks to remove artifacts by specifying a generative model of the two-channel fluorescence that incorporates motion artifact, neural activity, and noise. We use Bayesian inference to infer latent neural activity under this model, thus reducing the motion artifact present in the measured fluorescence traces. We further present a novel method for evaluating ground-truth performance of motion-correction algorithms by comparing the decodability of behavior from two types of neural recordings: a recording with both an activity-dependent and an activity-independent fluorophore (GCaMP and RFP), and a recording in which both fluorophores were activity-independent (GFP and RFP). A successful motion-correction method should decode behavior from the first type of recording, but not the second. We use this metric to systematically compare five models for removing motion artifacts from fluorescence time traces. Using TMAC-inferred activity, we decode locomotion from a GCaMP-expressing animal 20x more accurately on average than from control, outperforming all other motion-correction methods tested, the best of which were ~8x more accurate than control.
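
The following Python sketch is only an illustration of the correction problem described in the abstract, not the authors' TMAC implementation: it simulates toy two-channel data in which an assumed multiplicative motion artifact is shared across channels, applies the ratio baseline the abstract mentions, and then computes a per-timepoint MAP estimate of activity under that assumed model with a Gaussian prior. All variable names, noise levels, and the exact model form are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

# Simulate toy two-channel data: a latent activity trace and a shared,
# slowly varying motion artifact, with independent noise in each channel.
T = 1000
activity = 1.0 + 0.3 * np.sin(np.linspace(0.0, 20.0, T))        # latent neural signal
motion = 1.0 + 0.2 * np.cumsum(rng.standard_normal(T)) / 30.0   # shared motion artifact
motion = np.clip(motion, 0.5, None)

sigma_g, sigma_r = 0.05, 0.05                                   # channel-independent noise
green = activity * motion + sigma_g * rng.standard_normal(T)    # GCaMP-like channel
red = motion + sigma_r * rng.standard_normal(T)                 # RFP-like channel

# Baseline mentioned in the abstract: take the ratio of the two channels.
# This ignores the channel-independent noise in each measurement.
ratio_estimate = green / red

# Toy per-timepoint MAP estimate under the assumed model above:
# green_t = a_t * m_t + noise, red_t = m_t + noise, Gaussian prior on a_t.
# TMAC itself infers smooth latent time series for activity and motion with
# Bayesian inference; this one-step estimate only illustrates modeling the
# noise explicitly instead of dividing it through.
prior_mean, prior_var = 1.0, 0.25
m_hat = red                                                      # crude motion estimate
post_var = 1.0 / (m_hat**2 / sigma_g**2 + 1.0 / prior_var)
a_hat = post_var * (m_hat * green / sigma_g**2 + prior_mean / prior_var)

for name, est in (("ratio", ratio_estimate), ("toy MAP", a_hat)):
    rmse = np.sqrt(np.mean((est - activity) ** 2))
    print(f"{name:8s} RMSE vs. true activity: {rmse:.3f}")

The prior on the activity is what distinguishes the MAP sketch from the plain ratio: where the motion estimate is small or noisy, the posterior shrinks toward the prior mean rather than amplifying the measurement noise.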
Publication Date: 28-Sep-2022
Electronic Publication Date: 28-Sep-2022
Citation: Creamer, Matthew S, Chen, Kevin S, Leifer, Andrew M, Pillow, Jonathan W. (2022). Correcting motion induced fluorescence artifacts in two-channel neural imaging. PLOS Computational Biology, 18 (9), e1010421 - e1010421. doi:10.1371/journal.pcbi.1010421
DOI: 10.1371/journal.pcbi.1010421
EISSN: 1553-7358
Pages: e1010421 - e1010421
Language: en
Type of Material: Journal Article
Journal/Proceeding Title: PLOS Computational Biology
Version: Final published version. This is an open access article.


