QSGD: Communication-efficient SGD via gradient quantization and encoding

Alistarh D-A, Grubic D, Li J, Tomioka R, Vojnović M. 2017. QSGD: Communication-efficient SGD via gradient quantization and encoding. NIPS: Neural Information Processing Systems, Advances in Neural Information Processing Systems, vol. 2017, 1710–1721.

OA https://arxiv.org/abs/1610.02132 [Submitted Version]
Conference Paper | Published | English
Author
Alistarh, Dan-Adrian (ISTA); Grubic, Demjan; Li, Jerry; Tomioka, Ryota; Vojnović, Milan

Corresponding author has ISTA affiliation

Series Title
Advances in Neural Information Processing Systems
Abstract
Parallel implementations of stochastic gradient descent (SGD) have received significant research attention, thanks to the excellent scalability properties of the algorithm. A fundamental barrier when parallelizing SGD is the high bandwidth cost of communicating gradient updates between nodes; consequently, several lossy compression heuristics have been proposed, by which nodes only communicate quantized gradients. Although effective in practice, these heuristics do not always converge. In this paper, we propose Quantized SGD (QSGD), a family of compression schemes with convergence guarantees and good practical performance. QSGD allows the user to smoothly trade off communication bandwidth and convergence time: nodes can adjust the number of bits sent per iteration, at the cost of possibly higher variance. We show that this trade-off is inherent, in the sense that improving it past some threshold would violate information-theoretic lower bounds. QSGD guarantees convergence for convex and non-convex objectives, under asynchrony, and can be extended to stochastic variance-reduced techniques. When applied to training deep neural networks for image classification and automated speech recognition, QSGD leads to significant reductions in end-to-end training time. For instance, on 16 GPUs, we can train the ResNet-152 network to full accuracy on ImageNet 1.8× faster than the full-precision variant.
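
For intuition, the following is a minimal sketch of a QSGD-style stochastic s-level quantizer, reconstructed from the abstract's description of trading bits per coordinate against variance. It is an illustration only, not the authors' reference implementation: the function names and NumPy representation are assumptions, and the lossless integer-encoding step referenced in the title is omitted.

```python
import numpy as np

# Sketch of a QSGD-style stochastic s-level quantizer (hypothetical names and
# signatures; not the authors' reference code). Each gradient coordinate is
# mapped to one of s+1 magnitude levels so that the decoded value is an
# unbiased estimate of the original: larger s means more bits per coordinate
# but lower quantization variance.

def qsgd_quantize(v, s, rng=None):
    """Compress gradient v to (signed integer levels, l2 norm)."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.zeros_like(v, dtype=np.int64), norm
    scaled = np.abs(v) * s / norm        # each entry lies in [0, s]
    lower = np.floor(scaled)             # nearest lower quantization level
    # Round up with probability equal to the fractional part, which makes
    # the quantizer unbiased: E[levels] = scaled.
    levels = lower + (rng.random(v.shape) < (scaled - lower))
    return (np.sign(v) * levels).astype(np.int64), norm

def qsgd_decode(signed_levels, norm, s):
    """Reconstruct the unbiased gradient estimate from the compressed form."""
    return norm * signed_levels / s

# Usage: a node transmits only small signed integers plus one float (the norm).
g = np.array([0.3, -1.2, 0.05, 0.0])
levels, norm = qsgd_quantize(g, s=4)
g_hat = qsgd_decode(levels, norm, s=4)   # unbiased: E[g_hat] equals g
```

In the paper, the integer levels are additionally compressed with a lossless code (the "encoding" of the title), which is what yields the bandwidth savings per iteration.
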
Publishing Year
2017
Date Published
2017-01-01
Publisher
Neural Information Processing Systems Foundation
Volume
2017
Page
1710-1721
Conference
NIPS: Neural Information Processing Systems
Conference Location
Long Beach, CA, United States
Conference Date
2017-12-04 – 2017-12-09
IST-REx-ID
431

Cite this

Alistarh D-A, Grubic D, Li J, Tomioka R, Vojnović M. QSGD: Communication-efficient SGD via gradient quantization and encoding. In: Vol 2017. Neural Information Processing Systems Foundation; 2017:1710-1721.
Alistarh, D.-A., Grubic, D., Li, J., Tomioka, R., & Vojnović, M. (2017). QSGD: Communication-efficient SGD via gradient quantization and encoding (Vol. 2017, pp. 1710–1721). Presented at the NIPS: Neural Information Processing Systems, Long Beach, CA, United States: Neural Information Processing Systems Foundation.
Alistarh, Dan-Adrian, Demjan Grubic, Jerry Li, Ryota Tomioka, and Milan Vojnović. “QSGD: Communication-Efficient SGD via Gradient Quantization and Encoding,” 2017:1710–21. Neural Information Processing Systems Foundation, 2017.
D.-A. Alistarh, D. Grubic, J. Li, R. Tomioka, and M. Vojnović, “QSGD: Communication-efficient SGD via gradient quantization and encoding,” presented at the NIPS: Neural Information Processing Systems, Long Beach, CA, United States, 2017, vol. 2017, pp. 1710–1721.
Alistarh D-A, Grubic D, Li J, Tomioka R, Vojnović M. 2017. QSGD: Communication-efficient SGD via gradient quantization and encoding. NIPS: Neural Information Processing Systems, Advances in Neural Information Processing Systems, vol. 2017, 1710–1721.
Alistarh, Dan-Adrian, et al. QSGD: Communication-Efficient SGD via Gradient Quantization and Encoding. Vol. 2017, Neural Information Processing Systems Foundation, 2017, pp. 1710–21.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]

Link(s) to Main File(s)
Access Level
OA Open Access


Sources

arXiv 1610.02132
