Accelerated mini-batch randomized block coordinate descent method

Author(s): Zhao, T; Yu, M; Wang, Y; Arora, R; Liu, H

Abstract: We consider regularized empirical risk minimization problems. In particular, we minimize the sum of a smooth empirical risk function and a nonsmooth regularization function. When the regularization function is block separable, we can solve the minimization problems in a randomized block coordinate descent (RBCD) manner. Existing RBCD methods usually decrease the objective value by exploiting the partial gradient of a randomly selected block of coordinates in each iteration. Thus they need all data to be accessible so that the partial gradient of the selected block can be computed exactly. However, such a batch setting may be computationally expensive in practice. In this paper, we propose a mini-batch randomized block coordinate descent (MRBCD) method, which estimates the partial gradient of the selected block based on a mini-batch of randomly sampled data in each iteration. We further accelerate the MRBCD method by exploiting the semi-stochastic optimization scheme, which effectively reduces the variance of the partial gradient estimators. Theoretically, we show that for strongly convex functions, the MRBCD method attains lower overall iteration complexity than existing RBCD methods. As an application, we further trim the MRBCD method to solve the regularized sparse learning problems. Our numerical experiments show that the MRBCD method naturally exploits the sparsity structure and achieves better computational performance than existing methods.
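The scheme described in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical implementation, not the authors' code: it applies mini-batch randomized block coordinate descent with an SVRG-style (semi-stochastic) variance-reduced partial gradient estimator to a Lasso problem, where the L1 penalty is block separable and each block update is a proximal (soft-thresholding) step. The problem instance, step size, block partition, and all parameter names are illustrative assumptions.

```python
import numpy as np


def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def mrbcd(A, b, lam=0.01, n_blocks=4, batch_size=5, n_epochs=30, seed=0):
    """Hypothetical sketch of mini-batch RBCD with variance reduction for
        min_x (1/2n) ||Ax - b||^2 + lam * ||x||_1.
    Each inner step picks a random block and a random mini-batch, forms a
    variance-reduced estimate of the block partial gradient, and applies a
    proximal update to that block only.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    L = np.linalg.norm(A, 2) ** 2 / n      # Lipschitz constant of the smooth part
    eta = 0.1 / L                          # conservative step size (assumption)
    m = 2 * n                              # inner iterations per epoch
    blocks = np.array_split(np.arange(d), n_blocks)
    x = np.zeros(d)
    for _ in range(n_epochs):
        x_snap = x.copy()
        mu = A.T @ (A @ x_snap - b) / n    # full gradient at the snapshot
        for _ in range(m):
            j = rng.integers(n_blocks)                 # random block
            S = rng.integers(n, size=batch_size)       # random mini-batch
            idx = blocks[j]
            # Variance-reduced partial gradient for block j:
            # (1/|S|) sum_i [grad_j f_i(x) - grad_j f_i(x_snap)] + mu_j
            r = A[S] @ x - b[S]
            r_snap = A[S] @ x_snap - b[S]
            g = A[S][:, idx].T @ (r - r_snap) / batch_size + mu[idx]
            x[idx] = soft_threshold(x[idx] - eta * g, eta * lam)
    return x
```

The snapshot gradient `mu` anchors each mini-batch estimate, so the estimator's variance shrinks as the iterate approaches the snapshot; this is what lets the method keep a constant step size, in contrast to plain stochastic block updates.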
Publication Date: 2014
Citation: Zhao, Tuo, Mo Yu, Yiming Wang, Raman Arora, and Han Liu. "Accelerated mini-batch randomized block coordinate descent method." In Advances in Neural Information Processing Systems, pp. 3329-3337. 2014.
ISSN: 1049-5258
Pages: 3329 - 3337
Type of Material: Conference Article
Journal/Proceeding Title: Advances in Neural Information Processing Systems 27
Version: Author's manuscript

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.