Recursive training of 2D-3D convolutional networks for neuronal boundary detection

Author(s): Lee, K; Zlateski, A; Vishwanathan, A; Seung, H. Sebastian

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1mq19
Abstract: Efforts to automate the reconstruction of neural circuits from 3D electron microscopic (EM) brain images are critical for the field of connectomics. An important computation for reconstruction is the detection of neuronal boundaries. Images acquired by serial section EM, a leading 3D EM technique, are highly anisotropic, with inferior quality along the third dimension. For such images, the 2D max-pooling convolutional network has set the standard for performance at boundary detection. Here we achieve a substantial gain in accuracy through three innovations. First, following the trend towards deeper networks for object recognition, we use a much deeper network than previously employed for boundary detection. Second, we incorporate 3D as well as 2D filters, to enable computations that use 3D context. Finally, we adopt a recursively trained architecture in which a first network generates a preliminary boundary map that is provided as input, along with the original image, to a second network that generates a final boundary map. Backpropagation training is accelerated by ZNN, a new implementation of 3D convolutional networks that uses multicore CPU parallelism for speed. Our hybrid 2D-3D architecture could be more generally applicable to other types of anisotropic 3D images, including video, and our recursive framework to any image labeling problem.
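The hybrid 2D-3D filtering and the two-stage recursive scheme described in the abstract can be illustrated with a short sketch. The code below is a hypothetical illustration, not the authors' implementation: the class and function names (Hybrid2D3DNet, recursive_boundary_detection), layer counts, and kernel sizes are invented here, and the paper's actual network is much deeper and was trained with ZNN rather than a GPU framework. It shows only the two ideas stated above: mixing 2D (1 x k x k) and 3D filters for anisotropic volumes, and feeding a preliminary boundary map back in alongside the raw image for a second pass.

```python
# Illustrative sketch (assumed names and layer sizes; not the paper's architecture).
import torch
import torch.nn as nn


class Hybrid2D3DNet(nn.Module):
    """Small hybrid convnet: 2D (1 x k x k) filters exploit the high-resolution
    in-plane context of anisotropic serial-section EM, while 3D filters add a
    limited amount of out-of-plane context."""

    def __init__(self, in_channels):
        super().__init__()
        self.features = nn.Sequential(
            # 2D filters: kernel extent 1 along the anisotropic (z) axis
            nn.Conv3d(in_channels, 16, kernel_size=(1, 5, 5), padding=(0, 2, 2)),
            nn.ReLU(inplace=True),
            nn.Conv3d(16, 16, kernel_size=(1, 5, 5), padding=(0, 2, 2)),
            nn.ReLU(inplace=True),
            # 3D filters: small extent along z for out-of-plane context
            nn.Conv3d(16, 16, kernel_size=(3, 3, 3), padding=1),
            nn.ReLU(inplace=True),
        )
        # 1x1x1 output layer producing a per-voxel boundary probability
        self.output = nn.Conv3d(16, 1, kernel_size=1)

    def forward(self, x):
        return torch.sigmoid(self.output(self.features(x)))


def recursive_boundary_detection(image, net1, net2):
    """Two-stage recursive scheme from the abstract: net1 produces a preliminary
    boundary map, which is concatenated with the raw image and passed to net2,
    which produces the final boundary map."""
    preliminary = net1(image)                          # stage 1: image only
    stacked = torch.cat([image, preliminary], dim=1)   # image + preliminary map
    return net2(stacked)                               # stage 2: final map


if __name__ == "__main__":
    # Dummy anisotropic EM volume: batch=1, channel=1, 8 sections of 64x64 pixels
    em = torch.rand(1, 1, 8, 64, 64)
    net1 = Hybrid2D3DNet(in_channels=1)
    net2 = Hybrid2D3DNet(in_channels=2)  # raw image + preliminary boundary map
    boundary_map = recursive_boundary_detection(em, net1, net2)
    print(boundary_map.shape)  # torch.Size([1, 1, 8, 64, 64])
```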
Publication Date: 2015
Electronic Publication Date: 2015
Citation: Lee, K., Zlateski, A., Vishwanathan, A., & Seung, H. S. (2015). Recursive training of 2D-3D convolutional networks for neuronal boundary detection. Advances in Neural Information Processing Systems, 2015-January, 3573-3581.
Pages: 3573 - 3581
Type of Material: Conference Article
Journal/Proceeding Title: Advances in Neural Information Processing Systems
Version: Author's manuscript
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.