
Principal component analysis on non-Gaussian dependent data

Author(s): Han, F; Liu, H

Abstract: In this paper, we analyze the performance of a semiparametric principal component analysis called Copula Component Analysis (COCA) (Han & Liu, 2012) when the data are dependent. The semiparametric model assumes that, after unspecified marginally monotone transformations, the distributions are multivariate Gaussian. We study the scenario where the observations are drawn from non-i.i.d. processes (m-dependent or, more generally, ϕ-mixing). We show that COCA can tolerate weak dependence. In particular, we provide generalization bounds of convergence for both support recovery and parameter estimation of COCA on dependent data, along with explicit sufficient conditions on the degree of dependence under which the parametric rate can be maintained. To our knowledge, this is the first work analyzing the theoretical performance of PCA for dependent data in high-dimensional settings. Our results strictly generalize the analysis in Han & Liu (2012), and the techniques we use are of separate interest for analyzing a variety of other multivariate statistical methods.
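The rank-based idea behind COCA can be illustrated with a short sketch. The snippet below is a minimal, hedged illustration (not the authors' implementation): it estimates the latent Gaussian correlation matrix from Kendall's tau via the sine transform sin(π/2 · τ), which is invariant to the unspecified monotone marginal transformations, and then extracts the leading eigenvector. The function name `coca_leading_pc` is our own label for this sketch.

```python
import numpy as np

def kendall_tau(x, y):
    """Kendall's tau-a via the O(n^2) pairwise definition (fine for small n)."""
    n = len(x)
    s = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            s += np.sign(x[i] - x[j]) * np.sign(y[i] - y[j])
    return 2.0 * s / (n * (n - 1))

def coca_leading_pc(X):
    """Sketch of a COCA-style leading principal component.

    Estimates the latent correlation matrix entrywise as sin(pi/2 * tau),
    then returns the eigenvector of the largest eigenvalue. Because Kendall's
    tau depends only on ranks, the estimate is unchanged by monotone
    transformations of each coordinate.
    """
    n, d = X.shape
    R = np.eye(d)
    for j in range(d):
        for k in range(j + 1, d):
            R[j, k] = R[k, j] = np.sin(np.pi / 2.0 * kendall_tau(X[:, j], X[:, k]))
    vals, vecs = np.linalg.eigh(R)  # eigenvalues in ascending order
    return vecs[:, -1]
```

For example, applying `coca_leading_pc` to data `Z` and to `np.exp(Z)` yields the same estimate, since `exp` is a monotone marginal transformation and leaves all ranks intact.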
Publication Date: 2013
Citation: Han, Fang, and Han Liu. "Principal component analysis on non-Gaussian dependent data." In Proceedings of the 30th International Conference on Machine Learning (ICML), 28 (2013): 240-248.
ISSN: 2640-3498
Pages: 240 - 248
Type of Material: Conference Article
Journal/Proceeding Title: 30th International Conference on Machine Learning, ICML
Version: Author's manuscript

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.