Exploiting Operation Importance for Differentiable Neural Architecture Search

Author(s): Zhou, Yuan; Xie, Xukai; Kung, Sun-Yuan

Abstract: Recently, differentiable neural architecture search (NAS) methods have made significant progress in reducing the computational cost of NAS. Existing methods search for the best architecture by choosing the candidate operations with the highest architecture weights. However, architecture weights cannot accurately reflect the importance of each operation; that is, the operation with the highest weight might not yield the best performance. To circumvent this deficiency, we propose a novel indicator that fully represents the operation importance and thus serves as an effective metric to guide the model search. Based on this indicator, we further develop a NAS scheme, “exploiting operation importance for effective NAS” (EoiNAS). More precisely, we propose a high-order Markov chain-based strategy to slim the search space, further improving search efficiency and accuracy. To evaluate the effectiveness of the proposed EoiNAS, we applied our method to two tasks: image classification and semantic segmentation. Extensive experiments on both tasks provide strong evidence that our method discovers high-performance architectures while guaranteeing the requisite efficiency during search.
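The abstract's central observation, that the operation with the highest architecture weight is not necessarily the most important one, can be illustrated with a minimal sketch. The operation names, weight values, and accuracy-drop scores below are hypothetical, and the importance score used here (validation-accuracy drop on removal) is only one plausible proxy, not the paper's actual indicator:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of raw scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Learned architecture weights for candidate operations on one edge
# (dummy values for illustration).
ops = ["skip_connect", "sep_conv_3x3", "max_pool_3x3"]
arch_weights = softmax([1.2, 1.0, 0.3])

# DARTS-style selection: keep the operation with the largest weight.
darts_choice = ops[max(range(len(ops)), key=lambda i: arch_weights[i])]

# An importance-style selection instead ranks operations by a measured
# effect, e.g. how much validation accuracy drops when each operation is
# ablated (dummy numbers; the real indicator in EoiNAS differs).
acc_drop_when_removed = {
    "skip_connect": 0.1,
    "sep_conv_3x3": 2.4,
    "max_pool_3x3": 0.5,
}
importance_choice = max(acc_drop_when_removed, key=acc_drop_when_removed.get)

print(darts_choice)       # operation with the highest architecture weight
print(importance_choice)  # operation with the highest measured importance
```

With these dummy numbers the two criteria disagree: the weight-based rule keeps the skip connection, while the importance-based rule keeps the separable convolution, which is exactly the kind of mismatch the paper's indicator is designed to avoid.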
Publication Date: 17-May-2021
Citation: Zhou, Yuan, Xie, Xukai, Kung, Sun-Yuan. (2022). Exploiting Operation Importance for Differentiable Neural Architecture Search. IEEE Transactions on Neural Networks and Learning Systems, 33 (11), 6235 - 6248. doi:10.1109/tnnls.2021.3072950
DOI: 10.1109/tnnls.2021.3072950
ISSN: 2162-237X
EISSN: 2162-2388
Pages: 6235 - 6248
Type of Material: Journal Article
Journal/Proceeding Title: IEEE Transactions on Neural Networks and Learning Systems
Version: Author's manuscript

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.