UVeQFed: Universal Vector Quantization for Federated Learning
Author(s): Shlezinger, Nir; Chen, Mingzhe; Eldar, Yonina C; Poor, H Vincent; Cui, Shuguang
To refer to this page use:
http://arks.princeton.edu/ark:/88435/pr1m902337
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Shlezinger, Nir | - |
dc.contributor.author | Chen, Mingzhe | - |
dc.contributor.author | Eldar, Yonina C | - |
dc.contributor.author | Poor, H Vincent | - |
dc.contributor.author | Cui, Shuguang | - |
dc.date.accessioned | 2024-02-04T02:10:38Z | - |
dc.date.available | 2024-02-04T02:10:38Z | - |
dc.date.issued | 2020-12-23 | en_US |
dc.identifier.citation | Shlezinger, Nir, Chen, Mingzhe, Eldar, Yonina C, Poor, H Vincent, Cui, Shuguang. (2021). UVeQFed: Universal Vector Quantization for Federated Learning. IEEE Transactions on Signal Processing, 69, 500-514. doi:10.1109/tsp.2020.3046971 | en_US |
dc.identifier.issn | 1053-587X | - |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1m902337 | - |
dc.description.abstract | Traditional deep learning models are trained at a centralized server using data samples collected from users. Such data samples often include private information, which the users may not be willing to share. Federated learning (FL) is an emerging approach for training such models without requiring the users to share their data. FL consists of an iterative procedure in which, at each iteration, the users train a copy of the learning model locally. The server then collects the individual updates and aggregates them into a global model. A major challenge in this method is the need for each user to repeatedly transmit its learned model over the throughput-limited uplink channel. In this work, we tackle this challenge using tools from quantization theory. In particular, we identify the unique characteristics of conveying trained models over rate-constrained channels, and propose a suitable quantization scheme for such settings, referred to as universal vector quantization for FL (UVeQFed). We show that combining universal vector quantization methods with FL yields a decentralized training system in which the compression of the trained models induces only minimal distortion. We then theoretically analyze this distortion, showing that it vanishes as the number of users grows. We also characterize how models trained with conventional federated averaging combined with UVeQFed converge to the model that minimizes the loss function. Our numerical results demonstrate the gains of UVeQFed over previously proposed methods in terms of both the distortion induced in quantization and the accuracy of the resulting aggregated model. (An illustrative sketch of this training-and-quantization procedure is given after the metadata record.) | en_US |
dc.format.extent | 500 - 514 | en_US |
dc.language.iso | en_US | en_US |
dc.relation.ispartof | IEEE Transactions on Signal Processing | en_US |
dc.rights | Author's manuscript | en_US |
dc.title | UVeQFed: Universal Vector Quantization for Federated Learning | en_US |
dc.type | Journal Article | en_US |
dc.identifier.doi | doi:10.1109/tsp.2020.3046971 | - |
dc.identifier.eissn | 1941-0476 | - |
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article | en_US |
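As a rough illustration of the procedure described in the abstract, the Python sketch below runs one round of federated averaging in which each user compresses its model update with subtractive dithered quantization before the uplink. This is a simplified stand-in, not the authors' implementation: UVeQFed employs universal lattice vector quantization with entropy coding, whereas this sketch uses a scalar uniform quantizer, and all names here (`dithered_quantize`, `federated_round`, the least-squares local step) are hypothetical.

```python
# Minimal sketch (not the paper's method): federated averaging with a
# scalar subtractive dithered quantizer standing in for UVeQFed's
# universal lattice vector quantization.
import numpy as np

def dithered_quantize(update, step, rng):
    """Subtractive dithered scalar quantization of one model update.

    Encoder: add a dither known to both sides, round onto a uniform grid
    of width `step`. Decoder: subtract the same dither. Encoder and decoder
    are collapsed into one function here; in practice the server regenerates
    the dither from a seed shared with the user.
    """
    dither = rng.uniform(-step / 2, step / 2, size=update.shape)
    quantized = step * np.round((update + dither) / step)  # transmitted value
    return quantized - dither                              # server-side recovery

def federated_round(global_model, user_data, lr=0.1, step=0.05, seed=0):
    """One FL iteration: local training, quantized uplink, server averaging."""
    rng = np.random.default_rng(seed)  # stands in for shared common randomness
    recovered = []
    for X, y in user_data:
        # Placeholder local step: one gradient step on a least-squares loss.
        grad = X.T @ (X @ global_model - y) / len(y)
        recovered.append(dithered_quantize(-lr * grad, step, rng))
    # The server averages the de-dithered updates; the quantization errors
    # average out across users, which is the mechanism behind the paper's
    # result that the induced distortion vanishes as the number of users grows.
    return global_model + np.mean(recovered, axis=0)

# Example usage with synthetic data for three users:
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    w = np.zeros(5)
    users = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(3)]
    w = federated_round(w, users)
    print(w)
```

The shared seed mirrors the common randomness that subtractive dithered quantization assumes between each user and the server, which is what makes the recovered update an unbiased, bounded-error version of the true one before averaging.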
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
UVeQFedUniversalVectorQuantization.pdf | | 958.53 kB | Adobe PDF |
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.