Recurrent Network Models of Sequence Generation and Memory

Author(s): Rajan, Kanaka; Harvey, Christopher D; Tank, David W

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1p843w1q
Abstract: Sequential activation of neurons is a common feature of network activity during a variety of behaviors, including working memory and decision making. Previous network models for sequences and memory emphasized specialized architectures in which a principled mechanism is pre-wired into their connectivity. Here we demonstrate that, starting from random connectivity and modifying a small fraction of connections, a largely disordered recurrent network can produce sequences and implement working memory efficiently. We use this process, called Partial In-Network training (PINning), to model and match cellular-resolution imaging data from the posterior parietal cortex during a virtual memory-guided two-alternative forced-choice task [Harvey, Coen, and Tank, 2012]. Analysis of the connectivity reveals that sequences propagate through the cooperation between recurrent synaptic interactions and external inputs, rather than through feedforward or asymmetric connections. Together, our results suggest that neural sequences may emerge through learning from largely unstructured network architectures.
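The abstract's central idea — training only a small fraction of the recurrent connections in an otherwise random rate network so that it produces a sequence — can be sketched with a FORCE-style recursive-least-squares rule restricted to a few "plastic" rows of the weight matrix. This is a minimal illustrative sketch, not the paper's model: the network size, gain, targets, and learning schedule below are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative parameters (assumptions, not the paper's values) ---
N = 100           # network units
g = 1.5           # gain of the random connectivity
n_plastic = 10    # only these units' incoming weights are trained (the PINning idea)
dt, tau = 0.1, 1.0
T = 200           # time steps per trial

J0 = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random initial connectivity
plastic = rng.choice(N, size=n_plastic, replace=False)
x0 = 0.3 * rng.standard_normal(N)                   # fixed initial condition

# Target rates: a sequence of Gaussian bumps, one per plastic unit
t_axis = np.arange(T) * dt
centers = np.linspace(3.0, 17.0, n_plastic)
targets = np.exp(-0.5 * ((t_axis[None, :] - centers[:, None]) / 1.0) ** 2)

def rollout(J):
    """Run the rate network from x0 and return the plastic units' rates over time."""
    x = x0.copy()
    rates = np.zeros((n_plastic, T))
    for t in range(T):
        r = np.tanh(x)
        rates[:, t] = r[plastic]
        x = x + dt / tau * (-x + J @ r)
    return rates

mse_before = np.mean((rollout(J0) - targets) ** 2)

# --- FORCE-style recursive least squares, applied only to the plastic rows ---
J = J0.copy()
P = np.eye(N)                 # shared running estimate of the inverse rate correlation
for epoch in range(10):
    x = x0.copy()
    for t in range(T):
        r = np.tanh(x)
        Pr = P @ r
        k = Pr / (1.0 + r @ Pr)
        P -= np.outer(k, Pr)                 # RLS update of the inverse correlation
        err = r[plastic] - targets[:, t]     # rate errors of the plastic units
        J[plastic, :] -= np.outer(err, k)    # modify only the small trained fraction
        x = x + dt / tau * (-x + J @ r)

mse_after = np.mean((rollout(J) - targets) ** 2)
print(mse_before, mse_after)
```

After training, the sequence error of the plastic units should drop relative to the untrained random network, even though most of the connectivity remains disordered — the point the abstract makes about sequences emerging from largely unstructured architectures.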
Publication Date: 6-Apr-2016
Citation: Rajan, Kanaka, Harvey, Christopher D, Tank, David W. (2016). Recurrent Network Models of Sequence Generation and Memory. Neuron, 90 (1), 128 - 142. doi:10.1016/j.neuron.2016.02.009
DOI: 10.1016/j.neuron.2016.02.009
ISSN: 0896-6273
Pages: 128 - 142
Type of Material: Journal Article
Journal/Proceeding Title: Neuron
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.