
Low-rank and sparse structure pursuit via alternating minimization

Author(s): Gu, Quanquan; Wang, Zhaoran; Liu, Han

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1pr51
Abstract: In this paper, we present a nonconvex alternating minimization algorithm for low-rank and sparse structure pursuit. Compared with methods based on convex relaxation, the proposed algorithm is computationally more efficient for large-scale problems. In our study, we define a notion of bounded difference of gradients, based on which we rigorously prove that, with suitable initialization, the proposed nonconvex optimization algorithm enjoys linear convergence to the global optimum and exactly recovers the underlying low-rank and sparse matrices under standard conditions such as incoherence and sparsity conditions. For a wide range of statistical models, such as multi-task learning and robust principal component analysis (RPCA), our algorithm provides a principled approach to learning the low-rank and sparse structures with provable guarantees. Thorough experiments on both synthetic and real datasets back up our theory.
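The alternating scheme described in the abstract can be illustrated with a minimal sketch: alternately fit the low-rank component by a truncated SVD of the residual and the sparse component by hard thresholding. This is the generic template only, not the paper's exact algorithm, which additionally relies on a suitable initialization and incoherence/sparsity conditions for its linear-convergence and exact-recovery guarantees; the function name and parameters below are illustrative.

```python
import numpy as np

def alt_min_lrs(M, rank, thresh, n_iter=50):
    """Sketch of alternating minimization for M ~ L + S:
    alternate a truncated-SVD step (low-rank part L) with a
    hard-thresholding step (sparse part S) on the residual."""
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # Low-rank step: best rank-`rank` approximation of M - S.
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Sparse step: keep only residual entries above the threshold.
        R = M - L
        S = np.where(np.abs(R) > thresh, R, 0.0)
    return L, S
```

On an exactly rank-r input with no corruption, one iteration already returns `L = M` and `S = 0`; on corrupted inputs, the output `L` always has rank at most `rank` and every nonzero entry of `S` exceeds the threshold in magnitude, by construction.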
Publication Date: 2016
Citation: Gu, Quanquan, Zhaoran Wang, and Han Liu. "Low-rank and sparse structure pursuit via alternating minimization." In Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51 (2016): 600-609.
ISSN: 2640-3498
Pages: 600 - 609
Type of Material: Conference Article
Series/Report no.: Proceedings of Machine Learning Research
Journal/Proceeding Title: Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
Version: Final published version. Article is made available in OAR by the publisher's permission or policy.



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.