Abstract: Linear optimization is often algorithmically simpler than nonlinear convex optimization. Linear optimization over matroid polytopes, matching polytopes, and path polytopes, for example, admits simple and efficient combinatorial algorithms, whereas the corresponding nonlinear convex problems are harder and admit significantly less efficient algorithms. This motivates the computational model of convex optimization, including the offline, online, and stochastic settings, using a linear optimization oracle. In this computational model we give several new results that improve on the previous state of the art. Our main result is a novel conditional gradient algorithm for smooth and strongly convex optimization over polyhedral sets that performs only a single linear optimization step over the domain on each iteration and enjoys a linear convergence rate. This gives an exponential improvement in convergence rate over previous results. Based on this new conditional gradient algorithm, we give the first algorithms for online convex optimization over polyhedral sets that perform only a single linear optimization step over the domain while having optimal regret guarantees, answering an open question of Kalai and Vempala and of Hazan and Kale. Our online algorithms also imply conditional gradient algorithms for nonsmooth and stochastic convex optimization with the same convergence rates as projected (sub)gradient methods.
Citation: Garber, Dan, and Elad Hazan. "A linearly convergent variant of the conditional gradient algorithm under strong convexity, with applications to online and stochastic optimization." SIAM Journal on Optimization 26, no. 3 (2016): 1493-1528. doi:10.1137/140985366
Pages: 1493-1528
Type of Material: Journal Article
Journal/Proceeding Title: SIAM Journal on Optimization
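To illustrate the computational model the abstract describes, here is a minimal sketch of the classic conditional gradient (Frank-Wolfe) method, which calls a linear optimization oracle once per iteration and never projects. This is the standard method with the usual 2/(t+2) step size, not the paper's linearly convergent variant; the probability simplex, the quadratic objective, and the function names (`linear_oracle_simplex`, `conditional_gradient`) are illustrative choices, chosen because linear optimization over the simplex has a trivial closed-form oracle.

```python
import numpy as np

def linear_oracle_simplex(grad):
    # Linear optimization oracle for the probability simplex:
    # argmin_{v in simplex} <grad, v> is attained at the vertex e_i
    # with i = argmin_i grad_i.
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def conditional_gradient(grad_f, x0, oracle, iters=500):
    # Classic Frank-Wolfe: one oracle call per iteration, no projections.
    # The step size 2/(t+2) gives the standard O(1/t) rate for smooth
    # convex objectives; the paper's variant improves this to a linear
    # rate under strong convexity over polyhedral sets.
    x = x0.copy()
    for t in range(iters):
        v = oracle(grad_f(x))          # single linear optimization step
        gamma = 2.0 / (t + 2.0)
        x = x + gamma * (v - x)        # convex combination stays feasible
    return x

# Toy example: minimize ||x - b||^2 over the simplex, with b interior,
# so the minimizer is b itself.
b = np.array([0.2, 0.5, 0.3])
grad_f = lambda x: 2.0 * (x - b)
x0 = np.array([1.0, 0.0, 0.0])         # start at a vertex
x_star = conditional_gradient(grad_f, x0, linear_oracle_simplex)
```

Because every iterate is a convex combination of simplex vertices, feasibility is maintained for free, which is the appeal of the oracle model over projection-based methods on domains where projection is expensive.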
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.