Finite-sum Composition Optimization via Variance Reduced Gradient Descent
Author(s): Lian, Xiangru; Wang, Mengdi; Liu, Ji
To refer to this page use:
http://arks.princeton.edu/ark:/88435/pr1s19r
Abstract: | The stochastic composition optimization problem proposed recently by Wang et al. [2014] minimizes an objective in the compositional expectation form: min_x (E_i F_i ∘ E_j G_j)(x). It summarizes many important applications in machine learning, statistics, and finance. In this paper, we consider the finite-sum scenario for composition optimization: min_x f(x) := (1/n) Σ_{i=1}^n F_i( (1/m) Σ_{j=1}^m G_j(x) ). We propose two algorithms to solve this problem by combining stochastic compositional gradient descent (SCGD) with the stochastic variance reduced gradient (SVRG) technique. A constant linear convergence rate is proved for strongly convex optimization, which substantially improves the sublinear rate O(K^{-0.8}) of the best known algorithm. Copyright 2017 by the author(s). |
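The finite-sum objective above composes an outer average of the F_i with an inner average of the G_j, so its full gradient follows from the chain rule: ∇f(x) = (∂G(x))^T ∇F(G(x)) with G(x) = (1/m) Σ_j G_j(x). The following is a minimal, hypothetical Python sketch of that objective's exact gradient, using illustrative quadratic F_i and affine G_j; it is not the paper's SCGD/SVRG-based algorithm, which replaces these full averages with variance-reduced stochastic estimates.

```python
# Minimal sketch (not the paper's algorithm): the finite-sum composition
# objective f(x) = (1/n) sum_i F_i( (1/m) sum_j G_j(x) ) and its full
# gradient via the chain rule, grad f(x) = Jac_G(x)^T * grad_F(G(x)).
# The quadratic F_i and affine G_j below are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
n, m, d, p = 5, 4, 3, 2          # numbers of F_i, G_j, and dimensions
A = rng.normal(size=(m, p, d))   # G_j(x) = A_j x + b_j  (maps R^d -> R^p)
b = rng.normal(size=(m, p))
c = rng.normal(size=(n, p))      # F_i(y) = 0.5 * ||y - c_i||^2  (maps R^p -> R)

def G(x):
    # inner finite-sum map: (1/m) sum_j (A_j x + b_j)
    return np.mean(A @ x + b, axis=0)

def grad_f(x):
    # full compositional gradient at x
    y = G(x)
    grad_F = np.mean(y - c, axis=0)   # gradient of (1/n) sum_i F_i at y
    jac_G = np.mean(A, axis=0)        # Jacobian of G (p x d), constant here
    return jac_G.T @ grad_F

x = np.zeros(d)
print(grad_f(x))
```

In this sketch the exact gradient requires touching all n + m component functions per step; the paper's contribution is estimating it cheaply from sampled F_i and G_j while retaining a linear convergence rate in the strongly convex case.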
Publication Date: | 1-Jan-2017 |
Electronic Publication Date: | 20-May-2017 |
Citation: | Lian, X., Wang, M., & Liu, J. (2017). Finite-sum Composition Optimization via Variance Reduced Gradient Descent. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS 2017). |
Pages: | 1 - 30 |
Type of Material: | Conference Article |
Journal/Proceeding Title: | Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017 |
Version: | Final published version. This is an open access article. |