
Shampoo: Preconditioned Stochastic Tensor Optimization

Author(s): Gupta, Vineet; Koren, Tomer; Singer, Yoram

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1t54j
Full metadata record
dc.contributor.author: Gupta, Vineet
dc.contributor.author: Koren, Tomer
dc.contributor.author: Singer, Yoram
dc.date.accessioned: 2021-10-08T19:50:01Z
dc.date.available: 2021-10-08T19:50:01Z
dc.date.issued: 2018
dc.identifier.citation: Gupta, Vineet, Tomer Koren, and Yoram Singer. "Shampoo: Preconditioned Stochastic Tensor Optimization." In Proceedings of the 35th International Conference on Machine Learning 80 (2018): pp. 1842-1850.
dc.identifier.issn: 2640-3498
dc.identifier.uri: http://proceedings.mlr.press/v80/gupta18a/gupta18a.pdf
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr1t54j
dc.description.abstract: Preconditioned gradient methods are among the most general and powerful tools in optimization. However, preconditioning requires storing and manipulating prohibitively large matrices. We describe and analyze a new structure-aware preconditioning algorithm, called Shampoo, for stochastic optimization over tensor spaces. Shampoo maintains a set of preconditioning matrices, each of which operates on a single dimension, contracting over the remaining dimensions. We establish convergence guarantees in the stochastic convex setting, the proof of which builds upon matrix trace inequalities. Our experiments with state-of-the-art deep learning models show that Shampoo is capable of converging considerably faster than commonly used optimizers. Surprisingly, although it involves a more complex update rule, Shampoo's runtime per step is comparable in practice to that of simple gradient methods such as SGD, AdaGrad, and Adam.
dc.format.extent: 1842 - 1850
dc.language.iso: en_US
dc.relation.ispartof: Proceedings of the 35th International Conference on Machine Learning
dc.rights: Final published version. Article is made available in OAR by the publisher's permission or policy.
dc.title: Shampoo: Preconditioned Stochastic Tensor Optimization
dc.type: Conference Article
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding
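The abstract's description of "a set of preconditioning matrices, each of which operates on a single dimension" can be illustrated in the matrix (two-dimensional) case: Shampoo accumulates left and right statistics from the gradient's two contractions and preconditions with their inverse fourth roots. Below is a minimal NumPy sketch of that single step; the function name, hyperparameter values, and eigendecomposition-based matrix root are illustrative choices, not taken from the authors' code.

```python
import numpy as np

def shampoo_step(W, G, L, R, lr=0.1, eps=1e-4):
    """One Shampoo update for a matrix parameter W with gradient G.

    L (m x m) and R (n x n) are the left/right preconditioner statistics;
    each acts on one dimension of W, contracting over the other.
    """
    L = L + G @ G.T   # accumulate statistics for the row dimension
    R = R + G.T @ G   # accumulate statistics for the column dimension

    def inv_fourth_root(M):
        # M^{-1/4} via eigendecomposition of the symmetric PSD matrix M;
        # eps clips tiny eigenvalues for numerical stability.
        w, V = np.linalg.eigh(M)
        return (V * np.clip(w, eps, None) ** -0.25) @ V.T

    # Update: W <- W - lr * L^{-1/4} G R^{-1/4}
    W_new = W - lr * inv_fourth_root(L) @ G @ inv_fourth_root(R)
    return W_new, L, R
```

Storing L and R costs O(m^2 + n^2) rather than the O(m^2 n^2) a full preconditioner over the flattened parameter would require, which is the structure-awareness the abstract refers to.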

Files in This Item:
File         Description    Size      Format
Shampoo.pdf                 1.8 MB    Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.