Abstract: Substantial progress has been made recently on developing provably accurate and efficient algorithms for low-rank matrix factorization via nonconvex optimization. While conventional wisdom often takes a dim view of nonconvex optimization algorithms due to their susceptibility to spurious local minima, simple iterative methods such as gradient descent have been remarkably successful in practice. The theoretical footings, however, had been largely lacking until recently. In this tutorial-style overview, we highlight the important role of statistical models in enabling efficient nonconvex optimization with performance guarantees. We review two contrasting approaches: (1) two-stage algorithms, which consist of a tailored initialization step followed by successive refinement; and (2) global landscape analysis and initialization-free algorithms. Several canonical matrix factorization problems are discussed, including but not limited to matrix sensing, phase retrieval, matrix completion, blind deconvolution, and robust principal component analysis. Special care is taken to illustrate the key technical insights underlying their analyses. This article serves as a testament that the integrated consideration of optimization and statistics leads to fruitful research findings.
Citation: Chi, Y., Lu, Y. M., & Chen, Y. (2019). Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview. IEEE Transactions on Signal Processing, 67, 5239-5269. doi:10.1109/TSP.2019.2937282
Pages: 5239-5269
Type of Material: Journal Article
Journal/Proceeding Title: IEEE Transactions on Signal Processing
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.