
Accelerating Stochastic Composition Optimization

Author(s): Wang, Mengdi; Liu, Ji; Fang, Ethan X.

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr14r3p
Abstract: Consider the stochastic composition optimization problem where the objective is a composition of two expected-value functions. We propose a new stochastic first-order method, namely the accelerated stochastic compositional proximal gradient (ASC-PG) method, which updates based on queries to the sampling oracle using two different timescales. ASC-PG is the first proximal gradient method for the stochastic composition problem that can handle nonsmooth regularization penalties. We show that ASC-PG exhibits faster convergence than the best known algorithms, and that it achieves the optimal sample-error complexity in several important special cases. We further demonstrate the application of ASC-PG to reinforcement learning and conduct numerical experiments.
Publication Date: 1-Jan-2016
Citation: Wang, Mengdi, Ji Liu, and Ethan X. Fang. "Accelerating stochastic composition optimization." Advances in Neural Information Processing Systems (2016): 1722-1730. https://arxiv.org/abs/1607.07329v1
ISSN: 1049-5258
Pages: 1722–1730
Type of Material: Conference Article
Journal/Proceeding Title: Advances in Neural Information Processing Systems
Version: Author's manuscript
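For concreteness, below is a minimal, hypothetical Python sketch of a two-timescale proximal update of the kind described in the abstract: a fast step updates the solution estimate x via a stochastic proximal gradient step, while a slowly averaged auxiliary sequence y tracks the inner expected-value function at an extrapolated point. The toy least-squares problem, the oracle noise, the step-size schedules, and all variable names are illustrative assumptions, not details taken from the paper.

import numpy as np

# Hypothetical sketch of a two-timescale stochastic compositional proximal
# gradient update for  min_x  f(E[g_w(x)]) + R(x),  with R(x) = lam*||x||_1.
# The toy problem and parameter choices below are illustrative, not the
# authors' setup.

rng = np.random.default_rng(0)
d, m = 5, 3
A = rng.standard_normal((m, d))          # inner map: g_w(x) = (A + noise) @ x
b = rng.standard_normal(m)               # outer function: f(y) = 0.5*||y - b||^2
lam = 0.1                                # weight of the nonsmooth l1 penalty

def sample_g(x):
    """Noisy sample of the inner expectation g(x) = A @ x and its Jacobian."""
    An = A + 0.1 * rng.standard_normal(A.shape)
    return An @ x, An

def grad_f(y):
    """Gradient of the outer function f(y) = 0.5*||y - b||^2."""
    return y - b

def prox_l1(v, t):
    """Proximal operator of t*lam*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

x = np.zeros(d)                          # solution estimate (fast timescale)
y = np.zeros(m)                          # estimate of g at x (slow timescale)
for k in range(1, 2001):
    alpha = 1.0 / k**0.8                 # fast step size (illustrative)
    beta = 1.0 / k**0.6                  # slow averaging weight (illustrative)
    _, J = sample_g(x)
    # proximal stochastic gradient step using the chain-rule estimate J^T f'(y)
    x_new = prox_l1(x - alpha * J.T @ grad_f(y), alpha)
    # query the inner oracle at an extrapolated point, then average into y
    z = (1.0 - 1.0 / beta) * x + (1.0 / beta) * x_new
    g_val, _ = sample_g(z)
    y = (1.0 - beta) * y + beta * g_val
    x = x_new

print("final objective ~",
      0.5 * np.linalg.norm(A @ x - b)**2 + lam * np.abs(x).sum())

The soft-thresholding step is what makes this a proximal scheme: it handles the nonsmooth l1 penalty exactly rather than subgradient-stepping through it, which is the capability the abstract highlights for ASC-PG.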