Beyond convexity: Stochastic quasi-convex optimization

Author(s): Hazan, Elad; Levy, KY; Shalev-Shwartz, S

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr12x0n
Full metadata record
DC Field                   Value
dc.contributor.author      Hazan, Elad
dc.contributor.author      Levy, KY
dc.contributor.author      Shalev-Shwartz, S
dc.date.accessioned        2018-07-20T15:08:44Z
dc.date.available          2018-07-20T15:08:44Z
dc.date.issued             2015
dc.identifier.citation     Hazan, E., Levy, K. Y., & Shalev-Shwartz, S. (2015). Beyond convexity: Stochastic quasi-convex optimization. Advances in Neural Information Processing Systems, 2015-January, 1594-1602.
dc.identifier.uri          http://arks.princeton.edu/ark:/88435/pr12x0n
dc.description.abstract    Stochastic convex optimization is a basic and well-studied primitive in machine learning. It is well known that convex and Lipschitz functions can be minimized efficiently using Stochastic Gradient Descent (SGD). The Normalized Gradient Descent (NGD) algorithm is an adaptation of Gradient Descent that updates according to the direction of the gradients rather than the gradients themselves. In this paper we analyze a stochastic version of NGD and prove its convergence to a global minimum for a wider class of functions: we require the functions to be quasi-convex and locally Lipschitz. Quasi-convexity broadens the concept of unimodality to multiple dimensions and allows for certain types of saddle points, which are a known hurdle for first-order optimization methods such as gradient descent. Locally Lipschitz functions are required to be Lipschitz only in a small region around the optimum. This assumption circumvents gradient explosion, which is another known hurdle for gradient descent variants. Interestingly, unlike the vanilla SGD algorithm, the stochastic normalized gradient descent algorithm provably requires a minimal minibatch size.
dc.format.extent           1594-1602
dc.language.iso            en_US
dc.relation.ispartof       Advances in Neural Information Processing Systems
dc.rights                  Author's manuscript
dc.title                   Beyond convexity: Stochastic quasi-convex optimization
dc.type                    Conference Article
dc.date.eissued            2015
pu.type.symplectic         http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding
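
The abstract describes an update rule that steps along the direction of a minibatch gradient rather than the gradient itself. Below is a minimal Python sketch of such a stochastic normalized gradient step; the function names (sngd, grad_fn, noisy_grad) and hyperparameter values are illustrative assumptions, not code from the paper.

```python
import numpy as np

def sngd(grad_fn, x0, lr=0.1, batch_size=100, n_steps=1000, seed=0):
    """Sketch of a stochastic normalized gradient descent loop.

    grad_fn(x, batch_size, rng) is assumed to return a minibatch
    estimate of the gradient at x; this is not the paper's code.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        g = grad_fn(x, batch_size, rng)   # minibatch gradient estimate
        norm = np.linalg.norm(g)
        if norm > 0:
            x = x - lr * (g / norm)       # step along the direction only,
                                          # discarding the gradient's magnitude
    return x

# Toy usage: noisy gradients of f(x) = ||x||^2 (convex, hence quasi-convex).
# Averaging over a larger minibatch shrinks the noise in the estimate.
def noisy_grad(x, batch_size, rng):
    noise = rng.normal(scale=1.0 / np.sqrt(batch_size), size=x.shape)
    return 2.0 * x + noise

x_final = sngd(noisy_grad, x0=np.ones(5))
```

Intuitively, the normalization is why the abstract's minimal minibatch size matters: near the optimum the true gradient is small, so without enough averaging the normalized step is dominated by noise.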

Files in This Item:
File                                                       Size       Format
Beyond convexity Stochastic quasiconvex optimization.pdf  470.64 kB  Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.