
Diffusion Approximations for Online Principal Component Estimation and Global Convergence

Author(s): Li, Chris Junchi; Wang, Mengdi; Liu, Han; Zhang, Tong

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1wv1k
Abstract: In this paper, we adopt diffusion approximation tools to study the dynamics of Oja's iteration, an online stochastic gradient method for principal component analysis. Oja's iteration maintains a running estimate of the true principal component from streaming data with low time and space complexity. We show that Oja's iteration for the top eigenvector generates a continuous-state, discrete-time Markov chain over the unit sphere. Using diffusion approximation and weak convergence tools, we characterize Oja's iteration in three phases. This three-phase analysis further yields a finite-sample error bound for the running estimate that matches the minimax information lower bound for PCA under the additional assumption of bounded samples.
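For readers unfamiliar with the update the abstract describes, below is a minimal NumPy sketch of Oja's iteration for the top eigenvector: each streaming sample triggers a stochastic gradient step followed by renormalization, which keeps the iterate on the unit sphere (the Markov chain's state space). The function name, fixed step size, initialization, and toy data are illustrative assumptions, not the paper's exact settings or analysis.

```python
import numpy as np

def oja_top_eigenvector(sample_stream, dim, step_size=0.005, seed=0):
    """Minimal sketch of Oja's iteration for the top principal component.

    Maintains a unit-norm running estimate w of the top eigenvector of
    the samples' covariance. The step-size choice here is illustrative.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(dim)
    w /= np.linalg.norm(w)             # initialize on the unit sphere
    for x in sample_stream:
        w += step_size * x * (x @ w)   # stochastic gradient step
        w /= np.linalg.norm(w)         # project back onto the sphere
    return w

# Toy usage: samples with diagonal covariance diag(4, 1, 0.25),
# so the top eigenvector is e_1 up to sign.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cov_sqrt = np.diag([2.0, 1.0, 0.5])
    stream = (cov_sqrt @ rng.standard_normal(3) for _ in range(20000))
    w_hat = oja_top_eigenvector(stream, dim=3)
    print(np.round(w_hat, 3))          # approximately ±(1, 0, 0)
```

Note that each update costs O(d) time and O(d) memory per sample, which is the low time and space complexity the abstract refers to.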
Publication Date: 2017
Citation: Li, Chris Junchi, Mengdi Wang, Han Liu, and Tong Zhang. "Diffusion approximations for online principal component estimation and global convergence." In Advances in Neural Information Processing Systems (2017): 645-655.
ISSN: 1049-5258
Pages: 646-656
Type of Material: Conference Article
Journal/Proceeding Title: 31st Conference on Neural Information Processing Systems (NIPS 2017)
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.