
Chaining mutual information and tightening generalization bounds

Author(s): Asadi, AR; Abbe, Emmanuel; Verdú, S

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1pc2m
Full metadata record
DC Field | Value | Language
dc.contributor.author | Asadi, AR | -
dc.contributor.author | Abbe, Emmanuel | -
dc.contributor.author | Verdú, S | -
dc.date.accessioned | 2021-10-08T20:16:08Z | -
dc.date.available | 2021-10-08T20:16:08Z | -
dc.date.issued | 2018 | en_US
dc.identifier.citation | Asadi, AR, Abbe, E, Verdú, S. (2018). Chaining mutual information and tightening generalization bounds. Advances in Neural Information Processing Systems, 2018-December, 7234 - 7243. | en_US
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1pc2m | -
dc.description.abstract | Bounding the generalization error of learning algorithms has a long history, which yet falls short of explaining various generalization successes, including those of deep learning. Two important difficulties are (i) exploiting the dependencies between the hypotheses, and (ii) exploiting the dependence between the algorithm's input and output. Progress on the first point was made with the chaining method, originating from the work of Kolmogorov and used in the VC-dimension bound. More recently, progress on the second point was made with the mutual information method by Russo and Zou '15. Yet, these two methods are currently disjoint. In this paper, we introduce a technique to combine the chaining and mutual information methods, to obtain a generalization bound that is both algorithm-dependent and exploits the dependencies between the hypotheses. We provide an example in which our bound significantly outperforms both the chaining and the mutual information bounds. As a corollary, we tighten Dudley's inequality when the learning algorithm chooses its output from a small subset of hypotheses with high probability. | en_US
dc.format.extent | 7234 - 7243 | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | Advances in Neural Information Processing Systems | en_US
dc.rights | Author's manuscript | en_US
dc.title | Chaining mutual information and tightening generalization bounds | en_US
dc.type | Conference Article | en_US
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US
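
For context on the abstract above, the two ingredients the paper combines are often stated as follows. This is an illustrative sketch rather than the paper's own statement: the notation (empirical risk L_S, population risk L_mu, sample S, output hypothesis W), the subgaussian assumption, and the constant C are assumptions, not taken from this record.

% Mutual information bound (Russo and Zou '15; written here in the form popularized by Xu and Raginsky '17):
% assume the loss \ell(w, Z) is \sigma-subgaussian under Z \sim \mu for every hypothesis w.
\mathbb{E}\bigl[ L_\mu(W) - L_S(W) \bigr] \;\le\; \sqrt{ \frac{2\sigma^2}{n} \, I(S; W) }
% Here S is the n-sample training set, W the hypothesis returned by the algorithm,
% L_S the empirical risk and L_\mu the population risk.

% Chaining (Dudley entropy-integral) bound for a separable, centered subgaussian
% process (X_w)_{w \in \mathcal{W}} on the metric space (\mathcal{W}, d):
\mathbb{E}\Bigl[ \sup_{w \in \mathcal{W}} X_w \Bigr] \;\le\; C \int_0^{\infty} \sqrt{ \log N(\mathcal{W}, d, \epsilon) } \, \mathrm{d}\epsilon
% where N(\mathcal{W}, d, \epsilon) is the \epsilon-covering number of \mathcal{W} and C a universal constant.
% Per the abstract, the paper's contribution is a single algorithm-dependent bound combining these two ideas.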

Files in This Item:
File | Description | Size | Format
Chaining mutual information and tightening generalization bounds.pdf | | 469.04 kB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.