
Multi-Level Stochastic Gradient Methods for Nested Composition Optimization

Author(s): Yang, Shuoguang; Wang, Mengdi; Fang, Ethan X.

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr18185
Abstract: Stochastic gradient methods are scalable for solving large-scale optimization problems that involve empirical expectations of loss functions. Existing results mainly apply to optimization problems where the objectives are one- or two-level expectations. In this paper, we consider the multilevel composition optimization problem, which involves compositions of multiple component functions and nested expectations over a random path. This problem finds applications in risk-averse optimization and sequential planning. We propose a class of multilevel stochastic gradient methods motivated by multi-timescale stochastic approximation. First, we propose a basic T-level stochastic compositional gradient algorithm. Then we develop accelerated multilevel stochastic gradient methods that use an extrapolation-interpolation scheme to take advantage of the smoothness of individual component functions. When all component functions are smooth, we show that the convergence rate improves to O(n^{-4/(7+T)}) for general objectives and O(n^{-4/(3+T)}) for strongly convex objectives. We also provide almost sure convergence and rate-of-convergence results for nonconvex problems. The proposed methods and theoretical results are validated using numerical experiments.
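
To make the multi-timescale idea in the abstract concrete, below is a minimal two-level (T = 2) sketch in Python: a fast timescale tracks the inner expectation with a running average, while a slower timescale updates the decision variable via the chain rule. The toy objective, the step-size exponents, and the helper names (inner_sample, outer_grad_sample) are illustrative assumptions, not the paper's T-level algorithm or its accelerated variants.

```python
# Sketch of a two-level stochastic compositional gradient step.
# Toy problem (an assumption for illustration): minimize
#   f(g(x)) with g(x) = E[A x + noise] and f(y) = 0.5 ||y - b||^2.
import numpy as np

rng = np.random.default_rng(0)
d = 5

def inner_sample(x):
    """Noisy sample of the inner map g(x) = A x; returns value and Jacobian."""
    A = np.eye(d)
    return A @ x + 0.1 * rng.standard_normal(d), A

def outer_grad_sample(y):
    """Noisy sample of grad f(y) for f(y) = 0.5 ||y - b||^2 with b = 1."""
    b = np.ones(d)
    return (y - b) + 0.1 * rng.standard_normal(d)

x = np.zeros(d)
y = np.zeros(d)  # running estimate of E[g(x)] (the "fast" timescale)
for k in range(1, 10001):
    alpha = 1.0 / k**0.75  # slow step size for the decision variable
    beta = 1.0 / k**0.5    # faster step size for the inner tracker
    g_val, g_jac = inner_sample(x)
    y = (1 - beta) * y + beta * g_val               # track inner expectation
    x = x - alpha * g_jac.T @ outer_grad_sample(y)  # chain-rule update

print("final x (should approach the all-ones vector):", x)
```

In the T-level setting described in the abstract, one would maintain T-1 such trackers, one per nested expectation, each on its own timescale.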
Publication Date: 1-Jan-2019
Citation: Yang, S., Wang, M., & Fang, E. X. (2019). Multi-Level Stochastic Gradient Methods for Nested Composition Optimization. SIAM Journal on Optimization, 29(1), 616-659. doi:10.1137/18M1164846
DOI: 10.1137/18M1164846
ISSN: 1052-6234
Pages: 616-659
Type of Material: Journal Article
Journal/Proceeding Title: SIAM Journal on Optimization
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.