
Non-vacuous Generalization Bounds at the ImageNet Scale: a PAC-Bayesian Compression Approach

Author(s): Zhou, Wenda; Veitch, Victor; Austern, Morgane; Adams, Ryan P.; Orbanz, Peter

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1vn7f
Full metadata record (DC field: value [language]):

dc.contributor.author: Zhou, Wenda
dc.contributor.author: Veitch, Victor
dc.contributor.author: Austern, Morgane
dc.contributor.author: Adams, Ryan P.
dc.contributor.author: Orbanz, Peter
dc.date.accessioned: 2021-10-08T19:45:43Z
dc.date.available: 2021-10-08T19:45:43Z
dc.date.issued: 2019 [en_US]
dc.identifier.citation: Zhou, Wenda, Victor Veitch, Morgane Austern, Ryan P. Adams, and Peter Orbanz. "Non-vacuous Generalization Bounds at the ImageNet Scale: a PAC-Bayesian Compression Approach." International Conference on Learning Representations (2019). [en_US]
dc.identifier.uri: https://iclr.cc/Conferences/2019/Schedule?showEvent=806
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr1vn7f
dc.description.abstract: Modern neural networks are highly overparameterized, with capacity to substantially overfit to training data. Nevertheless, these networks often generalize well in practice. It has also been observed that trained networks can often be compressed to much smaller representations. The purpose of this paper is to connect these two empirical observations. Our main technical result is a generalization bound for compressed networks based on the compressed size that, combined with off-the-shelf compression algorithms, leads to state-of-the-art generalization guarantees. In particular, we provide the first non-vacuous generalization guarantees for realistic architectures applied to the ImageNet classification problem. Additionally, we show that compressibility of models that tend to overfit is limited. Empirical results show that an increase in overfitting increases the number of bits required to describe a trained network. [en_US]
dc.language.iso: en_US [en_US]
dc.relation.ispartof: International Conference on Learning Representations (ICLR) [en_US]
dc.rights: Final published version. Article is made available in OAR by the publisher's permission or policy. [en_US]
dc.title: Non-vacuous Generalization Bounds at the ImageNet Scale: a PAC-Bayesian Compression Approach [en_US]
dc.type: Conference Article [en_US]
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding [en_US]
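
The compression argument summarized in the abstract above has a classical backbone: if a trained network can be described by a short code, a weighted union bound converts the description length directly into a generalization guarantee. The LaTeX sketch below states that generic Occam-style bound. It is a simplified illustration under assumptions not spelled out in this record (a prefix-free code c(·) over hypotheses and bounded 0-1 loss), not the paper's actual PAC-Bayesian theorem, which refines this idea for compressed networks.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Occam-style compression bound: a minimal sketch, not the paper's exact result.
% Assumptions (not from the record above): a prefix-free code $c(\cdot)$ over
% hypotheses, so Kraft's inequality $\sum_h 2^{-|c(h)|} \le 1$ holds, and
% 0--1 loss on $m$ i.i.d. samples.
Assign each hypothesis $h$ the prior weight $2^{-|c(h)|}$. Hoeffding's
inequality combined with a union bound weighted by these priors gives: with
probability at least $1-\delta$, simultaneously for all $h$,
\[
  L(h) \;\le\; \widehat{L}(h)
  \;+\; \sqrt{\frac{|c(h)|\,\ln 2 + \ln(1/\delta)}{2m}},
\]
where $L$ is the true risk and $\widehat{L}$ the empirical risk on the $m$
samples. The bound is non-vacuous precisely when the compressed size
$|c(h)|$ is small relative to $m$, which is why compressing an
ImageNet-scale network can yield a meaningful guarantee.
\end{document}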

Files in This Item:
NonvacuousBoundsAtImageNetScale.pdf (295.74 kB, Adobe PDF)


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.