Abstract: We provide a general theory of the expectation-maximization (EM) algorithm for inferring high dimensional latent variable models. In particular, we make two contributions: (i) For parameter estimation, we propose a novel high dimensional EM algorithm which naturally incorporates sparsity structure into parameter estimation. With an appropriate initialization, this algorithm converges at a geometric rate and attains an estimator with the (near-)optimal statistical rate of convergence. (ii) Based on the obtained estimator, we propose a new inferential procedure for testing hypotheses for low dimensional components of high dimensional parameters. For a broad family of statistical models, our framework establishes the first computationally feasible approach for optimal estimation and asymptotic inference in high dimensions.
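As an illustration of the idea in the abstract, the sketch below shows how an EM update can "naturally incorporate sparsity structure": after each M-step, the iterate is hard-thresholded to its `s` largest-magnitude coordinates. This is a minimal toy instance for a symmetric two-component Gaussian mixture (the model, function names, and parameters here are illustrative assumptions, not the paper's exact formulation).

```python
import numpy as np

def truncated_em(Y, s, beta0, sigma2=1.0, n_iter=20):
    """EM with a hard-truncation step for the symmetric mixture
    y = z * beta + noise, where z = +/-1 with probability 1/2 each.
    Illustrative sketch only: the truncation keeps the s largest
    coordinates of each M-step output to enforce sparsity."""
    beta = beta0.copy()
    for _ in range(n_iter):
        # E-step: posterior mean of the latent label z given y
        w = np.tanh(Y @ beta / sigma2)
        # M-step: closed-form update of the mean parameter
        beta = (w[:, None] * Y).mean(axis=0)
        # Truncation step: zero out all but the top-s coordinates
        keep = np.argsort(-np.abs(beta))[:s]
        mask = np.zeros_like(beta)
        mask[keep] = 1.0
        beta *= mask
    return beta

# Usage on simulated sparse data (dimension d >> sparsity s)
rng = np.random.default_rng(0)
d, s, n = 100, 5, 2000
beta_star = np.zeros(d)
beta_star[:s] = 1.0
z = rng.choice([-1.0, 1.0], size=n)
Y = z[:, None] * beta_star + rng.standard_normal((n, d))
beta0 = beta_star + 0.5 * rng.standard_normal(d)  # warm initialization
beta_hat = truncated_em(Y, s, beta0)
```

With a sufficiently accurate initialization, the error contracts geometrically across iterations, which is the behavior the abstract's convergence guarantee describes.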
Citation: Wang, Zhaoran, Quanquan Gu, Yang Ning, and Han Liu. "High Dimensional EM Algorithm: Statistical Optimization and Asymptotic Normality." In Advances in Neural Information Processing Systems (2015): 2521-2529.
Pages: 2521-2529
Type of Material: Conference Article
Journal/Proceeding Title: Advances in Neural Information Processing Systems
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.