To refer to this page use:
http://arks.princeton.edu/ark:/88435/pr1jp2n
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ye, M | - |
dc.contributor.author | Abbe, Emmanuel | - |
dc.date.accessioned | 2021-10-08T20:16:08Z | - |
dc.date.available | 2021-10-08T20:16:08Z | - |
dc.date.issued | 2018 | en_US |
dc.identifier.citation | Ye, M., Abbe, E. (2018). Communication-computation efficient gradient coding. 35th International Conference on Machine Learning, ICML 2018, 12, 9716 - 9716. | en_US |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1jp2n | - |
dc.description.abstract | This paper develops coding techniques to reduce the running time of distributed learning tasks. It characterizes the fundamental tradeoff for computing gradients in terms of three parameters: computation load, straggler tolerance, and communication cost. It further gives an explicit coding scheme that achieves the optimal tradeoff based on recursive polynomial constructions, coding both across data subsets and across vector components. As a result, the proposed scheme makes it possible to minimize the running time of gradient computations. The scheme is implemented on Amazon EC2 clusters using Python with the mpi4py package. Results show that the proposed scheme maintains the same generalization error while reducing the running time by 32% compared to uncoded schemes and 23% compared to prior coded schemes that focus only on stragglers (Tandon et al., ICML 2017). (A minimal illustrative sketch of the straggler-coding idea follows this record.) | en_US |
dc.format.extent | 9716 - 9716 | en_US |
dc.language.iso | en_US | en_US |
dc.relation.ispartof | 35th International Conference on Machine Learning, ICML 2018 | en_US |
dc.rights | Author's manuscript | en_US |
dc.title | Communication-computation efficient gradient coding | en_US |
dc.type | Conference Article | en_US |
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US |
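The abstract above describes coding both across data subsets and across vector components; the full recursive polynomial construction is given in the paper itself. As a rough orientation only, below is a minimal, self-contained sketch of the straggler-tolerance side of gradient coding that the paper builds on (the classic 3-worker, 1-straggler example in the style of Tandon et al., ICML 2017). The toy least-squares data, the `partial_grad` helper, and the encoding matrix `B` are illustrative assumptions, not the authors' construction, and plain NumPy stands in for the mpi4py deployment mentioned in the abstract.

```python
import numpy as np

# Toy least-squares problem: gradient of ||Xw - y||^2 split over 3 data subsets.
# (Illustrative data only; not from the paper.)
rng = np.random.default_rng(0)
d = 5
X = rng.normal(size=(12, d))
y = rng.normal(size=12)
w = np.zeros(d)

parts = np.array_split(np.arange(12), 3)            # 3 data subsets

def partial_grad(idx):
    """Gradient of the squared loss restricted to one data subset (hypothetical helper)."""
    Xi, yi = X[idx], y[idx]
    return 2 * Xi.T @ (Xi @ w - yi)

g = np.stack([partial_grad(idx) for idx in parts])  # shape (3, d), one partial gradient per row

# Encoding matrix B for n=3 workers tolerating s=1 straggler:
# worker i transmits B[i] @ g, a linear combination of two partial gradients.
B = np.array([[0.5, 1.0,  0.0],
              [0.0, 1.0, -1.0],
              [0.5, 0.0,  1.0]])
coded = B @ g                                       # what each worker would send to the master

# Decoding vectors a_S with a_S @ B = [1, 1, 1] for every 2-worker surviving set S,
# so the full gradient is a fixed linear combination of any two coded responses.
decoders = {(0, 1): np.array([2.0, -1.0, 0.0]),
            (0, 2): np.array([1.0,  0.0, 1.0]),
            (1, 2): np.array([0.0,  1.0, 2.0])}

full_grad = g.sum(axis=0)
for survivors, a in decoders.items():
    s = list(survivors)
    recovered = a[s] @ coded[s]                     # use only the non-straggling workers
    assert np.allclose(recovered, full_grad)
print("full gradient recovered from every 2-of-3 worker subset")
```

Each worker transmits one linear combination of two partial gradients, so the master can reconstruct the full gradient from any two responses. The paper's scheme additionally splits each gradient vector into components, trading off computation load, straggler tolerance, and communication cost rather than tolerating stragglers alone.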
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
Communication-computation efficient gradient coding.pdf | | 414.41 kB | Adobe PDF |
Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.