A nonconvex optimization framework for low rank matrix estimation

Author(s): Zhao, Tuo; Wang, Zhaoran; Liu, Han

Abstract: We study the estimation of low rank matrices via nonconvex optimization. Compared with convex relaxation, nonconvex optimization exhibits superior empirical performance for large-scale instances of low rank matrix estimation. However, the understanding of its theoretical guarantees is limited. In this paper, we define the notion of projected oracle divergence, based on which we establish sufficient conditions for the success of nonconvex optimization. We illustrate the consequences of this general framework for matrix sensing. In particular, we prove that a broad class of nonconvex optimization algorithms, including alternating minimization and gradient-type methods, geometrically converge to the global optimum and exactly recover the true low rank matrices under standard conditions.
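As a minimal illustration of the alternating-minimization approach the abstract refers to, the sketch below recovers a rank-r matrix from Gaussian linear measurements y_i = ⟨A_i, M*⟩ by parameterizing M = UVᵀ and alternately solving the two least-squares subproblems. All dimensions, the random seed, and the iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, r, m = 20, 15, 2, 400  # matrix dims, rank, number of measurements

# Ground-truth rank-r matrix M* and Gaussian sensing matrices A_i
M_star = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
A = rng.standard_normal((m, n1, n2))
y = np.einsum('mij,ij->m', A, M_star)  # y_i = <A_i, M*>

# Alternating minimization on M = U V^T: each subproblem is linear least squares
U = rng.standard_normal((n1, r))
V = rng.standard_normal((n2, r))
for _ in range(30):
    # Fix V, solve for U: <A_i, U V^T> = <A_i V, U> is linear in vec(U)
    G = np.einsum('mij,jk->mik', A, V).reshape(m, -1)
    U = np.linalg.lstsq(G, y, rcond=None)[0].reshape(n1, r)
    # Fix U, solve for V: <A_i, U V^T> = <A_i^T U, V> is linear in vec(V)
    H = np.einsum('mji,jk->mik', A, U).reshape(m, -1)
    V = np.linalg.lstsq(H, y, rcond=None)[0].reshape(n2, r)

err = np.linalg.norm(U @ V.T - M_star) / np.linalg.norm(M_star)
print(f"relative recovery error: {err:.2e}")
```

With m well above the r(n1+n2) degrees of freedom, each subproblem is overdetermined and the iterates converge rapidly to the true matrix, consistent with the geometric convergence the paper proves under standard conditions.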
Publication Date: 1-Jan-2015
Citation: Zhao, T., Wang, Z., & Liu, H. (2015). A Nonconvex Optimization Framework for Low Rank Matrix Estimation. Advances in Neural Information Processing Systems, 28, 559–567.
ISSN: 1049-5258
Pages: 559 - 567
Type of Material: Journal Article
Journal/Proceeding Title: Advances in Neural Information Processing Systems
Version: Author's manuscript

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.