Improved Information-Theoretic Generalization Bounds for Distributed, Federated, and Iterative Learning

Author(s): Barnes, Leighton Pate; Dytso, Alex; Poor, Harold Vincent

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1gf0mw7q
Full metadata record
dc.contributor.author: Barnes, Leighton Pate
dc.contributor.author: Dytso, Alex
dc.contributor.author: Poor, Harold Vincent
dc.date.accessioned: 2024-02-17T04:42:43Z
dc.date.available: 2024-02-17T04:42:43Z
dc.identifier.citation: Barnes, Leighton Pate, Dytso, Alex, Poor, Harold Vincent. (2022). Improved Information-Theoretic Generalization Bounds for Distributed, Federated, and Iterative Learning. Entropy, 24 (9), 1178. doi:10.3390/e24091178
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr1gf0mw7q
dc.description.abstract: We consider information-theoretic bounds on the expected generalization error for statistical learning problems in a network setting. In this setting, there are K nodes, each with its own independent dataset, and the models from the K nodes have to be aggregated into a final centralized model. We consider both simple averaging of the models as well as more complicated multi-round algorithms. We give upper bounds on the expected generalization error for a variety of problems, such as those with Bregman divergence or Lipschitz continuous losses, that demonstrate an improved dependence of 1/K on the number of nodes. These “per node” bounds are in terms of the mutual information between the training dataset and the trained weights at each node and are therefore useful in describing the generalization properties inherent to having communication or privacy constraints at each node.
dc.language: en
dc.language.iso: en_US
dc.relation.ispartof: Entropy
dc.rights: Final published version. This is an open access article.
dc.title: Improved Information-Theoretic Generalization Bounds for Distributed, Federated, and Iterative Learning
dc.type: Journal Article
dc.identifier.doi: doi:10.3390/e24091178
dc.date.eissued: 2022-08-24
dc.identifier.eissn: 1099-4300
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article
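The abstract describes "per node" bounds in terms of the mutual information between each node's training dataset and its trained weights, with an improved 1/K dependence on the number of nodes. As an illustrative sketch only (not the paper's precise statement), the classical single-dataset bound of Xu and Raginsky for a σ-sub-Gaussian loss, and a per-node form of the shape the abstract suggests, can be written as follows; the symbols n (samples per node), σ, S_k, and W_k are assumptions for illustration:

```latex
% Classical single-node information-theoretic bound (Xu--Raginsky),
% assuming the loss is \sigma-sub-Gaussian and S has n samples:
\[
  \bigl|\mathbb{E}[\mathrm{gen}(S,W)]\bigr|
  \;\le\; \sqrt{\frac{2\sigma^2}{n}\, I(S;W)}.
\]
% Illustrative "per node" shape for K nodes with dataset S_k and weights
% W_k at node k, models aggregated by averaging; the 1/K prefactor is the
% improved dependence the abstract refers to (exact conditions differ):
\[
  \bigl|\mathbb{E}[\mathrm{gen}]\bigr|
  \;\le\; \frac{1}{K}\sum_{k=1}^{K}
  \sqrt{\frac{2\sigma^2}{n}\, I(S_k;W_k)}.
\]
```

Because each summand depends only on the local quantity I(S_k; W_k), bounds of this shape can directly incorporate per-node communication or privacy constraints, which cap that mutual information.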

Files in This Item:
File: Improved Information-Theoretic Generalization Bounds for Distributed, Federated, and Iterative Learning.pdf (1.44 MB, Adobe PDF)


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.