
Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval

Author(s): Chen, Yuxin; Chi, Y; Fan, Jianqing; Ma, C

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1bw1f
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Chen, Yuxin
dc.contributor.author: Chi, Y
dc.contributor.author: Fan, Jianqing
dc.contributor.author: Ma, C
dc.date.accessioned: 2021-10-11T14:17:36Z
dc.date.available: 2021-10-11T14:17:36Z
dc.date.issued: 2019-07-01 (en_US)
dc.identifier.citation: Chen, Y., Chi, Y., Fan, J., & Ma, C. (2019). Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval. Mathematical Programming, 176(1-2), 5-37. doi:10.1007/s10107-019-01363-6 (en_US)
dc.identifier.issn: 0025-5610
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr1bw1f
dc.description.abstract: © 2019, Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society. This paper considers the problem of solving systems of quadratic equations, namely, recovering an object of interest x^♮ ∈ R^n from m quadratic equations/samples y_i = (a_i^⊤ x^♮)^2, 1 ≤ i ≤ m. This problem, also dubbed phase retrieval, spans multiple domains including the physical sciences and machine learning. We investigate the efficacy of gradient descent (or Wirtinger flow) designed for the nonconvex least-squares problem. We prove that under Gaussian designs, gradient descent, when randomly initialized, yields an ϵ-accurate solution in O(log n + log(1/ϵ)) iterations given nearly minimal samples, thus achieving near-optimal computational and sample complexities at once. This provides the first global convergence guarantee concerning vanilla gradient descent for phase retrieval, without the need for (i) carefully designed initialization, (ii) sample splitting, or (iii) sophisticated saddle-point escaping schemes. All of this is achieved by exploiting the statistical models in analyzing optimization algorithms, via a leave-one-out approach that enables the decoupling of certain statistical dependency between the gradient descent iterates and the data. (en_US)
dc.format.extent: 5-37 (en_US)
dc.language.iso: en_US (en_US)
dc.relation.ispartof: Mathematical Programming (en_US)
dc.rights: Author's manuscript (en_US)
dc.title: Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval (en_US)
dc.type: Journal Article (en_US)
dc.identifier.doi: doi:10.1007/s10107-019-01363-6
dc.identifier.eissn: 1436-4646
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article (en_US)
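
As an illustration of the procedure described in the abstract above, the following is a minimal NumPy sketch of vanilla gradient descent (Wirtinger flow) on the nonconvex least-squares loss for real-valued phase retrieval under Gaussian designs, started from random initialization. The problem sizes, step size, and iteration count are illustrative assumptions, not the constants analyzed in the paper.

import numpy as np

# Sketch of the setting in the abstract: vanilla gradient descent on the
# nonconvex least-squares loss for real-valued phase retrieval under
# Gaussian designs. Sizes and step size below are illustrative choices.

rng = np.random.default_rng(0)

n, m = 100, 1000                     # signal dimension, number of samples
x_true = rng.standard_normal(n)      # ground truth x^natural
A = rng.standard_normal((m, n))      # Gaussian design vectors a_i as rows
y = (A @ x_true) ** 2                # quadratic samples y_i = (a_i^T x^natural)^2

def loss(x):
    # f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2
    return np.sum(((A @ x) ** 2 - y) ** 2) / (4 * m)

def grad(x):
    # gradient of f: (1/m) * sum_i ((a_i^T x)^2 - y_i) * (a_i^T x) * a_i
    r = A @ x
    return A.T @ ((r ** 2 - y) * r) / m

# Random initialization (no spectral initialization), as studied in the paper.
x = rng.standard_normal(n)
eta = 0.1 / np.mean(y)               # heuristic step size ~ 0.1 / ||x_true||^2

for t in range(500):
    x -= eta * grad(x)

# The signal is identifiable only up to global sign; report the
# sign-invariant relative error.
err = min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true))
print(f"final loss: {loss(x):.3e}, relative error: {err / np.linalg.norm(x_true):.2e}")

With sample size m around 10n, the iterates typically converge to x^♮ (up to sign), consistent with the two-stage behavior the paper establishes: a short initial phase followed by linear local convergence.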

Files in This Item:
File | Description | Size | Format
Gradient descent with random initialization fast global convergence for nonconvex phase retrieval.pdf | | 4.55 MB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.