High dimensional EM algorithm: Statistical optimization and asymptotic normality

Author(s): Wang, Z; Gu, Q; Ning, Y; Liu, H

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr17p3z
Abstract: We provide a general theory of the expectation-maximization (EM) algorithm for inferring high dimensional latent variable models. In particular, we make two contributions: (i) For parameter estimation, we propose a novel high dimensional EM algorithm which naturally incorporates sparsity structure into parameter estimation. With an appropriate initialization, this algorithm converges at a geometric rate and attains an estimator with the (near-)optimal statistical rate of convergence. (ii) Based on the obtained estimator, we propose a new inferential procedure for testing hypotheses for low dimensional components of high dimensional parameters. For a broad family of statistical models, our framework establishes the first computationally feasible approach for optimal estimation and asymptotic inference in high dimensions.
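The algorithm described in the abstract alternates a standard E-step with an M-step followed by a hard truncation that keeps only the largest coordinates, which is how sparsity is incorporated into parameter estimation. The following is a minimal illustrative sketch of that idea for a symmetric two-component Gaussian mixture with means +beta and -beta and known noise level, one of the model families such methods are typically applied to; the function names (`truncated_em_gmm`, `truncate`), the toy data, and the specific update formulas are assumptions for illustration, not the authors' code.

```python
import numpy as np

def truncate(beta, s):
    """Hard truncation: keep the s largest-magnitude coordinates, zero the rest."""
    out = np.zeros_like(beta)
    idx = np.argsort(np.abs(beta))[-s:]
    out[idx] = beta[idx]
    return out

def truncated_em_gmm(Y, s, beta0, sigma=1.0, n_iters=50):
    """Sparsity-constrained EM sketch for a symmetric two-component Gaussian
    mixture with means +beta and -beta and known isotropic noise level sigma.

    E-step: posterior probability that each sample comes from the +beta component.
    M-step: closed-form weighted average of the samples, followed by hard
    truncation to the top-s coordinates (the sparsity step).
    """
    beta = truncate(beta0, s)
    for _ in range(n_iters):
        # E-step: w_i = P(z_i = +1 | y_i, beta); clip to avoid overflow in exp
        u = np.clip(2.0 * (Y @ beta) / sigma**2, -30.0, 30.0)
        w = 1.0 / (1.0 + np.exp(-u))
        # M-step plus truncation to enforce sparsity
        beta = truncate(Y.T @ (2.0 * w - 1.0) / len(Y), s)
    return beta

# Toy usage: dimension 200, true parameter supported on 5 coordinates
rng = np.random.default_rng(0)
d, n, s = 200, 1000, 5
beta_star = np.zeros(d)
beta_star[:s] = 2.0
z = rng.choice([-1.0, 1.0], size=n)
Y = z[:, None] * beta_star + rng.normal(size=(n, d))
beta_hat = truncated_em_gmm(Y, s, beta0=beta_star + 0.5 * rng.normal(size=d))
print(np.linalg.norm(beta_hat - beta_star))
```

With an initialization close enough to the truth, iterates of this kind contract toward the true sparse parameter, which is the behavior the abstract refers to as geometric convergence to a (near-)optimal estimator.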
Publication Date: 2015
Citation: Wang, Zhaoran, Quanquan Gu, Yang Ning, and Han Liu. "High Dimensional EM Algorithm: Statistical Optimization and Asymptotic Normality." In Advances in Neural Information Processing Systems (2015): 2521-2529.
ISSN: 1049-5258
Pages: 2521 - 2529
Type of Material: Conference Article
Journal/Proceeding Title: Advances in Neural Information Processing Systems
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.