
Finite-sum Composition Optimization via Variance Reduced Gradient Descent

Author(s): Lian, Xiangru; Wang, Mengdi; Liu, Ji

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1s19r
Full metadata record
DC Field | Value | Language
dc.contributor.author | Lian, Xiangru | -
dc.contributor.author | Wang, Mengdi | -
dc.contributor.author | Liu, Ji | -
dc.date.accessioned | 2020-02-24T22:03:10Z | -
dc.date.available | 2020-02-24T22:03:10Z | -
dc.date.issued | 2017-01-01 | en_US
dc.identifier.citation | Lian, X., Wang, M., Liu, J. (2017). Finite-sum Composition Optimization via Variance Reduced Gradient Descent. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017 | en_US
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1s19r | -
dc.description.abstract | Stochastic composition optimization, proposed recently by Wang et al. [2014], minimizes an objective in the compositional expectation form min_x (E_i F_i ∘ E_j G_j)(x). It captures many important applications in machine learning, statistics, and finance. In this paper, we consider the finite-sum scenario for composition optimization: min_x ((1/n) Σ_{i=1}^n F_i ∘ (1/m) Σ_{j=1}^m G_j)(x). We propose two algorithms to solve this problem by combining stochastic compositional gradient descent (SCGD) with the stochastic variance reduced gradient (SVRG) technique. A constant linear convergence rate is proved for strongly convex optimization, which substantially improves on the sublinear rate O(K^{-0.8}) of the best previously known algorithm. Copyright 2017 by the author(s). | en_US
dc.format.extent | 1 - 30 | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017 | en_US
dc.rights | Final published version. This is an open access article. | en_US
dc.title | Finite-sum Composition Optimization via Variance Reduced Gradient Descent | en_US
dc.type | Conference Article | en_US
dc.date.eissued | 2017-05-20 | en_US
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US
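
The abstract above describes combining SCGD-style estimation of the inner finite sum with SVRG-style variance reduction. The NumPy sketch below illustrates that general idea on a toy quadratic-over-linear composition. The specific F_i and G_j, the step size, and the epoch lengths are assumptions made only for this sketch; the update rule is in the spirit of the paper's approach, not a reproduction of its exact algorithms.

# Minimal sketch of a variance-reduced compositional gradient method,
# illustrating the combination of SCGD and SVRG described in the abstract.
# The toy F_i, G_j, step size, and epoch lengths are assumptions for this
# sketch; this is not the paper's exact Algorithm 1 or 2.
import numpy as np

rng = np.random.default_rng(0)
n, m, d, p = 20, 30, 4, 5            # outer terms, inner terms, x-dim, y-dim

A = rng.normal(size=(m, p, d))        # inner maps G_j(x) = A_j x + b_j
b = rng.normal(size=(m, p))
C = rng.normal(size=(n, p, p))        # outer losses F_i(y) = 0.5*||C_i y - e_i||^2
e = rng.normal(size=(n, p))

G = lambda x: (A @ x + b).mean(axis=0)          # (1/m) sum_j G_j(x)
G_single = lambda x, j: A[j] @ x + b[j]
jac_G = A.mean(axis=0)                          # Jacobian of G (constant: G is linear)
grad_F = lambda y, i: C[i].T @ (C[i] @ y - e[i])
grad_F_full = lambda y: np.mean([grad_F(y, i) for i in range(n)], axis=0)

def full_grad(x):
    # Exact gradient of f(x) = (1/n) sum_i F_i(G(x)) via the chain rule.
    return jac_G.T @ grad_F_full(G(x))

def compositional_svrg(x0, epochs=50, inner=50, lr=0.02):
    x = x0.copy()
    for _ in range(epochs):
        x_ref = x.copy()                          # snapshot for this epoch
        y_ref = G(x_ref)                          # full inner value at the snapshot
        g_ref = jac_G.T @ grad_F_full(y_ref)      # full gradient at the snapshot
        for _ in range(inner):
            j = rng.integers(m)
            i = rng.integers(n)
            # variance-reduced estimate of the inner value G(x)
            y_hat = y_ref + G_single(x, j) - G_single(x_ref, j)
            # variance-reduced (low-variance, mildly biased) gradient estimate
            v = jac_G.T @ (grad_F(y_hat, i) - grad_F(y_ref, i)) + g_ref
            x -= lr * v
    return x

x_out = compositional_svrg(np.zeros(d))
print("gradient norm at output:", np.linalg.norm(full_grad(x_out)))

Because each epoch recomputes the full inner value and full gradient at a snapshot point, and the inner loop only corrects them with single-sample differences, the estimator's variance shrinks as the iterates approach the snapshot; this is the mechanism behind the linear rate the abstract claims for the strongly convex case.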

Files in This Item:
File | Description | Size | Format
OA_FiniteSumCompositionOptimizationVarianceReducedGradientDescent.pdf |  | 447.75 kB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.