To refer to this page use:
http://arks.princeton.edu/ark:/88435/pr1000q
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Li, X | - |
dc.contributor.author | Zhao, T | - |
dc.contributor.author | Arora, R | - |
dc.contributor.author | Liu, H | - |
dc.contributor.author | Hong, M | - |
dc.date.accessioned | 2021-10-11T14:16:50Z | - |
dc.date.available | 2021-10-11T14:16:50Z | - |
dc.date.issued | 2016 | en_US |
dc.identifier.citation | Li, Xingguo, Tuo Zhao, Raman Arora, Han Liu, and Mingyi Hong. "An improved convergence analysis of cyclic block coordinate descent-type methods for strongly convex minimization." Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR, 51 (2016): 491-499. | en_US |
dc.identifier.uri | http://proceedings.mlr.press/v51/li16c.html | - |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1000q | - |
dc.description.abstract | The cyclic block coordinate descent-type (CBCD-type) methods have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression, and sparse additive regression. Existing optimization literature has shown that the CBCD-type methods attain an iteration complexity of O(p ⋅ log(1/ε)), where ε is a pre-specified accuracy of the objective value and p is the number of blocks. However, such iteration complexity explicitly depends on p, and is therefore at least p times worse than that of gradient descent (GD) methods. To bridge this theoretical gap, we propose an improved convergence analysis for the CBCD-type methods. In particular, we first show that for a family of quadratic minimization problems, the iteration complexity of the CBCD-type methods matches that of the GD methods in terms of dependency on p (up to a log² p factor). Thus our complexity bounds are sharper than the existing bounds by at least a factor of p / log² p. We also provide a lower bound to confirm that our improved complexity bounds are tight (up to a log² p factor) if the largest and smallest eigenvalues of the Hessian matrix do not scale with p. Finally, we generalize our analysis to other strongly convex minimization problems beyond quadratic ones. | en_US |
dc.format.extent | 491 - 499 | en_US |
dc.language.iso | en_US | en_US |
dc.relation.ispartof | Proceedings of the 19th International Conference on Artificial Intelligence and Statistics | en_US |
dc.relation.ispartofseries | Proceedings of Machine Learning Research; | - |
dc.rights | Final published version. Article is made available in OAR by the publisher's permission or policy. | en_US |
dc.title | An improved convergence analysis of cyclic block coordinate descent-type methods for strongly convex minimization | en_US |
dc.type | Conference Article | en_US |
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US |
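To illustrate the CBCD-type method discussed in the abstract above, here is a minimal Python sketch (not the paper's implementation) of exact cyclic block coordinate descent on a strongly convex quadratic f(x) = ½xᵀAx − bᵀx. The block partition, test problem, and stopping tolerance are assumptions chosen for demonstration only.

```python
import numpy as np

def cbcd_quadratic(A, b, p, max_sweeps=500, tol=1e-8):
    """Cyclic block coordinate descent for f(x) = 0.5*x'Ax - b'x,
    with A symmetric positive definite. Splits the n coordinates
    into p contiguous blocks and exactly minimizes f over one block
    at a time, cycling until the objective stops decreasing."""
    n = len(b)
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), p)
    f = lambda z: 0.5 * z @ A @ z - b @ z
    prev = f(x)
    sweep = 0
    for sweep in range(max_sweeps):
        for idx in blocks:
            # Exact minimization over block idx with the rest fixed:
            # solve A[idx, idx] x_idx = b[idx] - A[idx, rest] x_rest.
            rhs = b[idx] - A[idx] @ x + A[np.ix_(idx, idx)] @ x[idx]
            x[idx] = np.linalg.solve(A[np.ix_(idx, idx)], rhs)
        cur = f(x)
        if prev - cur < tol:  # objective decrease has stalled
            break
        prev = cur
    return x, sweep + 1

# Tiny demo on a random strongly convex quadratic (assumed test data).
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)   # symmetric positive definite
b = rng.standard_normal(20)
x_hat, sweeps = cbcd_quadratic(A, b, p=4)
print(sweeps, np.linalg.norm(A @ x_hat - b))  # residual near 0
```

Exact per-block minimization is used here because, for quadratics, each block subproblem is a small linear solve; the paper's analysis covers this and related CBCD-type update rules.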
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
ImproveConvergeAnalBlockCoordDescentMin.pdf | | 398.37 kB | Adobe PDF |
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.