
NeST: A Neural Network Synthesis Tool Based on a Grow-and-Prune Paradigm

Author(s): Dai, Xiaoliang; Yin, Hongxu; Jha, Niraj K.

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1m61bp9d
Abstract: Deep neural networks (DNNs) have begun to have a pervasive impact on various applications of machine learning. However, finding an optimal DNN architecture for large applications is challenging. Common approaches opt for deeper and larger DNN architectures, but these may incur substantial redundancy. To address this problem, we introduce a network growth algorithm that complements network pruning to learn both weights and compact DNN architectures during training. We propose a DNN synthesis tool (NeST) that combines both methods to automate the generation of compact and accurate DNNs. NeST starts with a randomly initialized sparse network called the seed architecture. It iteratively tunes the architecture with gradient-based growth and magnitude-based pruning of neurons and connections. Our experimental results show that NeST yields accurate, yet very compact, DNNs across a wide range of seed architectures. For the LeNet-300-100 (LeNet-5) architecture, we reduce network parameters by 70.2× (74.3×) and floating-point operations (FLOPs) by 79.4× (43.7×). For the AlexNet, VGG-16, and ResNet-50 architectures, we reduce network parameters (FLOPs) by 15.7× (4.6×), 33.2× (8.9×), and 4.1× (2.1×), respectively. NeST's grow-and-prune paradigm delivers significant additional parameter and FLOPs reductions relative to pruning-only methods.
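To make the grow-and-prune loop described in the abstract concrete, here is a minimal NumPy sketch of one grow-then-prune iteration on a single dense layer. The mask-based bookkeeping, the seed density, the growth count k, the pruning threshold, and the stand-in gradient are all illustrative assumptions for this sketch, not the paper's exact policies or schedules.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single dense layer: weights W and a binary connectivity mask M.
# NeST starts from a sparse seed and alternates growth and pruning; the
# constants below are illustrative, not the paper's.
W = rng.normal(scale=0.1, size=(300, 100))
M = (rng.random(W.shape) < 0.2).astype(W.dtype)  # sparse seed architecture
W *= M

def grow(W, M, grad, k):
    """Gradient-based growth: activate the k dormant connections whose
    loss-gradient magnitude is largest (largest promised loss reduction)."""
    dormant = np.flatnonzero(M == 0)
    if dormant.size == 0:
        return W, M
    scores = np.abs(grad).ravel()[dormant]
    chosen = dormant[np.argsort(scores)[-k:]]
    M.ravel()[chosen] = 1.0
    # Initialize revived weights with a small step against the gradient.
    W.ravel()[chosen] = -0.05 * grad.ravel()[chosen]
    return W, M

def prune(W, M, threshold):
    """Magnitude-based pruning: deactivate active connections whose weight
    magnitude falls below the threshold."""
    keep = (np.abs(W) >= threshold) & (M == 1.0)
    M = keep.astype(W.dtype)
    return W * M, M

# One grow-then-prune step with a stand-in gradient (a real run would use
# the training-loss gradient after some epochs of weight tuning).
grad = rng.normal(size=W.shape)
W, M = grow(W, M, grad, k=500)
W, M = prune(W, M, threshold=0.02)
print(f"active connections: {int(M.sum())} / {M.size}")
```

In a full run this step would be interleaved with ordinary gradient-descent training of the active weights, repeating until the architecture stabilizes at a compact network.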
Publication Date: 2-May-2019
Citation: Dai, Xiaoliang, Yin, Hongxu, & Jha, Niraj K. (2019). NeST: A Neural Network Synthesis Tool Based on a Grow-and-Prune Paradigm. IEEE Transactions on Computers, 68(10), 1487-1497. doi:10.1109/TC.2019.2914438
DOI: 10.1109/TC.2019.2914438
Pages: 1487 - 1497
Type of Material: Journal Article
Journal/Proceeding Title: IEEE Transactions on Computers
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.