
Second-order stochastic optimization for machine learning in linear time

Author(s): Agarwal, N; Bullins, B; Hazan, Elad

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr13x1d
Full metadata record
DC Field | Value | Language
dc.contributor.author | Agarwal, N | -
dc.contributor.author | Bullins, B | -
dc.contributor.author | Hazan, Elad | -
dc.date.accessioned | 2018-07-20T15:08:41Z | -
dc.date.available | 2018-07-20T15:08:41Z | -
dc.date.issued | 2017-11-01 | en_US
dc.identifier.citation | Agarwal, N., Bullins, B., & Hazan, E. (2017). Second-order stochastic optimization for machine learning in linear time. Journal of Machine Learning Research, 18, 1 - 40. | en_US
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr13x1d | -
dc.description.abstract | First-order stochastic methods are the state of the art in large-scale machine learning optimization owing to their efficient per-iteration complexity. Second-order methods, while able to provide faster convergence, have been much less explored due to the high cost of computing the second-order information. In this paper we develop second-order stochastic methods for optimization problems in machine learning that match the per-iteration cost of gradient-based methods and, in certain settings, improve upon the overall running time of popular first-order methods. Furthermore, our algorithm has the desirable property of being implementable in time linear in the sparsity of the input data. | en_US
dc.format.extent | 1 - 40 | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | Journal of Machine Learning Research | en_US
dc.rights | Author's manuscript | en_US
dc.title | Second-order stochastic optimization for machine learning in linear time | en_US
dc.type | Journal Article | en_US
dc.date.eissued | 2017-11-01 | en_US
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article | en_US
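
The abstract above describes second-order stochastic methods whose per-iteration cost matches that of gradient-based methods and scales with the sparsity of the data. As a minimal sketch of how a step of that general flavor can be kept cheap, the snippet below estimates an inverse-Hessian-vector product from Hessian-vector products on individually sampled examples. The logistic-loss model, the truncated recursion, and all names and parameters are illustrative assumptions, not details taken from this record or from the paper itself.

```python
# Sketch (not the paper's exact algorithm): a stochastic Newton-type direction
# whose cost is dominated by Hessian-vector products on sampled examples, so
# no d x d Hessian is ever formed.
import numpy as np

def hvp(w, x, y, v, reg=1e-3):
    """Hessian-vector product for regularized logistic loss on one example.

    For l(w) = log(1 + exp(-y * x.w)) + (reg/2) * ||w||^2 the Hessian is
    sigma * (1 - sigma) * x x^T + reg * I, so H v needs only inner products.
    """
    sigma = 1.0 / (1.0 + np.exp(-y * x.dot(w)))
    return sigma * (1.0 - sigma) * x.dot(v) * x + reg * v

def stochastic_newton_direction(w, X, Y, grad, depth=50, rng=None):
    """Estimate H^{-1} grad via the truncated recursion
    v_t = grad + (I - H_i) v_{t-1}, with one sampled example per step.

    Assumes the loss is scaled so the Hessian spectrum lies in (0, 1];
    'depth' trades estimation accuracy against cost.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = X.shape[0]
    v = grad.copy()
    for _ in range(depth):
        i = rng.integers(n)
        v = grad + v - hvp(w, X[i], Y[i], v)
    return v

# Usage: compute a (mini-batch) gradient, then take a Newton-like step:
#   w = w - step_size * stochastic_newton_direction(w, X, Y, grad)
```

Each inner step costs one Hessian-vector product on a single example, which for a sparse example x is proportional to its number of nonzeros, so the per-iteration cost stays comparable to a stochastic gradient step times the chosen depth.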

Files in This Item:
File | Description | Size | Format
Second-order stochastic optimization for machine learning in linear time.pdf | - | 1.01 MB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.