Accelerated mini-batch randomized block coordinate descent method

Author(s): Zhao, T; Yu, M; Wang, Y; Arora, R; Liu, H

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr13w11
Full metadata record
DC Field | Value | Language
dc.contributor.author | Zhao, T | -
dc.contributor.author | Yu, M | -
dc.contributor.author | Wang, Y | -
dc.contributor.author | Arora, R | -
dc.contributor.author | Liu, H | -
dc.date.accessioned | 2021-10-11T14:16:49Z | -
dc.date.available | 2021-10-11T14:16:49Z | -
dc.date.issued | 2014 | en_US
dc.identifier.citation | Zhao, Tuo, Mo Yu, Yiming Wang, Raman Arora, and Han Liu. "Accelerated mini-batch randomized block coordinate descent method." In Advances in Neural Information Processing Systems, pp. 3329-3337. 2014. | en_US
dc.identifier.issn | 1049-5258 | -
dc.identifier.uri | http://papers.nips.cc/paper/5614-accelerated-mini-batch-randomized-block-coordinate-descent-method | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr13w11 | -
dc.description.abstract | We consider regularized empirical risk minimization problems. In particular, we minimize the sum of a smooth empirical risk function and a nonsmooth regularization function. When the regularization function is block separable, we can solve the minimization problems in a randomized block coordinate descent (RBCD) manner. Existing RBCD methods usually decrease the objective value by exploiting the partial gradient of a randomly selected block of coordinates in each iteration. Thus they need all data to be accessible so that the partial gradient of the selected block can be exactly obtained. However, such a batch setting may be computationally expensive in practice. In this paper, we propose a mini-batch randomized block coordinate descent (MRBCD) method, which estimates the partial gradient of the selected block based on a mini-batch of randomly sampled data in each iteration. We further accelerate the MRBCD method by exploiting a semi-stochastic optimization scheme, which effectively reduces the variance of the partial gradient estimators. Theoretically, we show that for strongly convex functions, the MRBCD method attains lower overall iteration complexity than existing RBCD methods. As an application, we further trim the MRBCD method to solve regularized sparse learning problems. Our numerical experiments show that the MRBCD method naturally exploits the sparsity structure and achieves better computational performance than existing methods. | en_US
dc.format.extent | 3329 - 3337 | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | Advances in Neural Information Processing Systems 27 | en_US
dc.rights | Author's manuscript | en_US
dc.title | Accelerated mini-batch randomized block coordinate descent method | en_US
dc.type | Conference Article | en_US
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US
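
The abstract above describes the core recipe: in each iteration, pick a random block of coordinates, estimate that block's partial gradient from a mini-batch, correct the estimate with a full gradient computed at a periodic snapshot (the semi-stochastic variance-reduction scheme), and apply a proximal update to the selected block only. The following Python sketch illustrates that recipe on a lasso objective, (1/2n)||Ax - b||^2 + lam*||x||_1, whose l1 regularizer is block separable. It is a minimal illustration under assumptions of my own: the function name mrbcd, the least-squares loss, and all default parameters are hypothetical and are not taken from the paper.

import numpy as np

def mrbcd(A, b, lam, n_blocks=4, batch_size=10, step=0.01,
          n_epochs=30, inner_iters=None, seed=0):
    """Sketch: mini-batch randomized block coordinate descent with
    SVRG-style variance reduction for the lasso objective
        (1/2n) * ||A x - b||^2 + lam * ||x||_1.
    Names and defaults are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    blocks = np.array_split(np.arange(d), n_blocks)
    if inner_iters is None:
        inner_iters = n  # roughly one data pass per outer stage
    x = np.zeros(d)
    for _ in range(n_epochs):
        x_snap = x.copy()
        # Full gradient at the snapshot: the "semi-stochastic" ingredient
        # that drives the mini-batch estimator's variance toward zero.
        full_grad = A.T @ (A @ x_snap - b) / n
        for _ in range(inner_iters):
            j = rng.integers(n_blocks)                 # random block
            idx = blocks[j]
            batch = rng.choice(n, size=batch_size, replace=False)
            Ab = A[batch]
            resid = Ab @ x - b[batch]
            resid_snap = Ab @ x_snap - b[batch]
            # Variance-reduced partial gradient on block j:
            #   grad_j f_B(x) - grad_j f_B(x_snap) + grad_j f(x_snap)
            g = Ab[:, idx].T @ (resid - resid_snap) / batch_size + full_grad[idx]
            # Proximal (soft-thresholding) step on the selected block only.
            z = x[idx] - step * g
            x[idx] = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0                                   # sparse ground truth
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = mrbcd(A, b, lam=0.1)
    print("nonzeros recovered:", np.flatnonzero(np.abs(x_hat) > 1e-3))

Because the snapshot's full gradient anchors every mini-batch estimate, the estimator's variance shrinks as x approaches x_snap, which is what permits a constant step size and the improved iteration complexity claimed in the abstract.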

Files in This Item:
File | Description | Size | Format
AccelRandomBlockCoordDescent.pdf | - | 1.86 MB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.