
Optimal Feature Selection in High-Dimensional Discriminant Analysis

Author(s): Kolar, Mladen; Liu, Han

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1ns2t
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Kolar, Mladen
dc.contributor.author: Liu, Han
dc.date.accessioned: 2021-10-11T14:16:59Z
dc.date.available: 2021-10-11T14:16:59Z
dc.date.issued: 2015-02 [en_US]
dc.identifier.citation: Kolar, Mladen, and Han Liu. "Optimal feature selection in high-dimensional discriminant analysis." IEEE Transactions on Information Theory 61, no. 2 (2015): 1063-1083. [en_US]
dc.identifier.issn: 0018-9448
dc.identifier.uri: https://arxiv.org/abs/1306.6557
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr1ns2t
dc.description.abstract: We consider the high-dimensional discriminant analysis problem. For this problem, different methods have been proposed and justified by establishing exact convergence rates for the classification risk, as well as ℓ2 convergence results for the discriminant rule. However, a sharp theoretical analysis of the variable selection performance of these procedures has not been established, even though model interpretation is of fundamental importance in scientific data analysis. This paper bridges the gap by providing sharp sufficient conditions for consistent variable selection using sparse discriminant analysis. Through careful analysis, we establish rates of convergence that are significantly faster than the best known results and admit an optimal scaling of the sample size n, dimensionality p, and sparsity level s in the high-dimensional setting. The sufficient conditions are complemented by necessary information-theoretic limits on the variable selection problem in the context of high-dimensional discriminant analysis. Exploiting a numerical equivalence result, we also establish optimal results for the ROAD estimator and the sparse optimal scoring estimator. Furthermore, we analyze an exhaustive search procedure, whose performance serves as a benchmark, and show that it is variable selection consistent under weaker conditions. Extensive simulations demonstrating the sharpness of the bounds are also provided. [en_US]
dc.format.extent: 1063 - 1083 [en_US]
dc.language.iso: en_US [en_US]
dc.relation.ispartof: IEEE Transactions on Information Theory [en_US]
dc.rights: Author's manuscript [en_US]
dc.title: Optimal Feature Selection in High-Dimensional Discriminant Analysis [en_US]
dc.type: Journal Article [en_US]
dc.identifier.doi: doi:10.1109/TIT.2014.2381241
dc.identifier.eissn: 1557-9654
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article [en_US]
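The variable selection setting described in the abstract can be illustrated with a minimal sketch: two Gaussian classes whose means differ only on a sparse set of s coordinates, with the support recovered by thresholding standardized mean differences. This is a simple screening proxy chosen for illustration, not the paper's penalized sparse discriminant estimator; all names and constants below are assumptions.

```python
import numpy as np

# Hypothetical illustration: only the first s of p coordinates carry
# discriminative signal between the two classes; recover that sparse
# support by thresholding the standardized mean-difference statistic.
rng = np.random.default_rng(0)
n, p, s = 200, 500, 5          # sample size, dimensionality, sparsity level

mu = np.zeros(p)
mu[:s] = 1.5                   # sparse mean shift between the classes

X0 = rng.standard_normal((n, p))          # class 0: N(0, I)
X1 = rng.standard_normal((n, p)) + mu     # class 1: N(mu, I)

# Standardized mean differences; noise coordinates concentrate near 0
diff = X1.mean(axis=0) - X0.mean(axis=0)
se = np.sqrt(X0.var(axis=0, ddof=1) / n + X1.var(axis=0, ddof=1) / n)
stat = np.abs(diff) / se

# A sqrt(2 log p) threshold, echoing the (n, p, s) scaling the abstract
# discusses; the exact constant here is an arbitrary illustrative choice
threshold = np.sqrt(2 * np.log(p))
selected = np.flatnonzero(stat > threshold)
print(selected)
```

With a strong enough signal, the selected set coincides with the true support {0, ..., s-1}; shrinking the mean shift or growing p relative to n makes recovery fail, which is the regime the paper's sufficient and necessary conditions characterize.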

Files in This Item:
File: OptimalFeatureSelectAnalysis.pdf (471.55 kB, Adobe PDF)


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.