Abstract: We consider the stochastic nested composition optimization problem, where the objective is a composition of two expected-value functions. We propose a new stochastic first-order method, the accelerated stochastic compositional proximal gradient (ASC-PG) method, which updates the solution based on noisy gradient queries using a two-timescale iteration. ASC-PG is the first proximal gradient method for the stochastic composition problem that can handle a nonsmooth regularization penalty. We show that ASC-PG exhibits faster convergence than the best-known algorithms and that it achieves the optimal sample-error complexity in several important special cases. We demonstrate the application of ASC-PG to reinforcement learning and conduct numerical experiments.
Citation: Wang, Mengdi, Ji Liu, and Ethan X. Fang. "Accelerating stochastic composition optimization." Journal of Machine Learning Research 18, no. 1 (2017): 3721-3743. http://www.jmlr.org/papers/volume18/16-504/16-504.pdf
Pages: 3721-3743
Type of Material: Journal Article
Journal/Proceeding Title: Journal of Machine Learning Research
Version: Final published version. This is an open access article.
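The abstract describes a two-timescale proximal gradient iteration for the nested composition problem min_x f(g(x)) + R(x), where f and g are expected-value functions accessible only through noisy queries. The following Python sketch illustrates the general two-timescale compositional proximal gradient template, not the paper's exact ASC-PG method: it omits the extrapolation/acceleration step behind the improved rates, assumes an L1 penalty as the nonsmooth regularizer R, uses illustrative stepsize schedules, and relies on hypothetical user-supplied sampling oracles (sample_g, sample_jac_g, sample_grad_f).

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding), which
    # handles the nonsmooth L1 regularization penalty.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def compositional_proximal_gradient(x0, sample_g, sample_jac_g,
                                    sample_grad_f, lam=0.1, num_iters=1000):
    # Generic two-timescale stochastic compositional proximal gradient
    # iteration for min_x f(g(x)) + lam * ||x||_1, where f = E[f_v] and
    # g = E[g_w] are only available through noisy first-order queries.
    x = np.asarray(x0, dtype=float).copy()
    y = sample_g(x)  # auxiliary running estimate of the inner value g(x)
    for k in range(1, num_iters + 1):
        alpha = 1.0 / k           # smaller stepsize: x moves on the slow timescale
        beta = 1.0 / np.sqrt(k)   # larger weight: y tracks g(x) on the fast timescale
        # Fast update: running average of noisy samples of the inner function.
        y = (1.0 - beta) * y + beta * sample_g(x)
        # Chain-rule estimate of the gradient of the composition f(g(x)).
        grad = sample_jac_g(x).T @ sample_grad_f(y)
        # Slow update: proximal gradient step absorbing the L1 penalty.
        x = soft_threshold(x - alpha * grad, alpha * lam)
    return x

# Toy usage (all oracles hypothetical): g(x) = Ax + noise and
# f(y) = 0.5 * ||y||^2, so the composition is a noisy least-squares
# objective with an L1 penalty.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
sample_g = lambda x: A @ x + 0.01 * rng.standard_normal(5)
sample_jac_g = lambda x: A + 0.01 * rng.standard_normal((5, 3))
sample_grad_f = lambda y: y + 0.01 * rng.standard_normal(5)
x_hat = compositional_proximal_gradient(np.zeros(3), sample_g,
                                        sample_jac_g, sample_grad_f)
```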
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.