The convergence of stochastic gradient descent in asynchronous shared memory
Alistarh D-A, De Sa C, Konstantinov NH. 2018. The convergence of stochastic gradient descent in asynchronous shared memory. Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing - PODC ’18. PODC: Principles of Distributed Computing, 169–178.
Download (ext.)
https://arxiv.org/abs/1803.08841
[Preprint]
Conference Paper
| Published
| English
Scopus indexed
Author
Dan-Adrian Alistarh, Christopher De Sa, Nikola H. Konstantinov
Department
Abstract
Stochastic Gradient Descent (SGD) is a fundamental algorithm in machine learning, representing the optimization backbone for training several classic models, from regression to neural networks. Given the recent practical focus on distributed machine learning, significant work has been dedicated to the convergence properties of this algorithm under the inconsistent and noisy updates arising from execution in a distributed environment. However, surprisingly, the convergence properties of this classic algorithm in the standard shared-memory model are still not well-understood. In this work, we address this gap, and provide new convergence bounds for lock-free concurrent stochastic gradient descent, executing in the classic asynchronous shared memory model, against a strong adaptive adversary. Our results give improved upper and lower bounds on the "price of asynchrony" when executing the fundamental SGD algorithm in a concurrent setting. They show that this classic optimization tool can converge faster and with a wider range of parameters than previously known under asynchronous iterations. At the same time, we exhibit a fundamental trade-off between the maximum delay in the system and the rate at which SGD can converge, which governs the set of parameters under which this algorithm can still work efficiently.
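The lock-free shared-memory execution described in the abstract can be pictured with a small Hogwild!-style sketch. The code below is an illustrative assumption, not the paper's construction: the least-squares objective, step size, thread count, and iteration budget are invented for the example. Each worker thread reads a possibly stale snapshot of the shared iterate and writes its update back without any locking; the staleness (delay) of such concurrent updates is exactly what the paper's bounds trade off against the convergence rate.

import threading
import numpy as np

# Synthetic least-squares problem: minimize f(x) = (1/2n) * ||A x - b||^2.
# (Illustrative assumption; the paper does not fix a particular objective here.)
n, d = 1000, 10
rng = np.random.default_rng(0)
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d)

x = np.zeros(d)            # shared iterate, read and written by all threads without locks
step = 0.01                # fixed step size (illustrative choice)
iters_per_thread = 2000
num_threads = 4

def worker(seed):
    local_rng = np.random.default_rng(seed)
    for _ in range(iters_per_thread):
        i = local_rng.integers(n)            # sample one data point
        view = x.copy()                      # read a possibly stale / inconsistent snapshot
        grad = (A[i] @ view - b[i]) * A[i]   # stochastic gradient at that snapshot
        x[:] -= step * grad                  # lock-free write-back; may race with other threads

threads = [threading.Thread(target=worker, args=(s,)) for s in range(num_threads)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))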
Publishing Year
2018
Date Published
2018-07-23
Proceedings Title
Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing - PODC '18
Publisher
ACM Press
Page
169-178
Conference
PODC: Principles of Distributed Computing
Conference Location
Egham, United Kingdom
Conference Date
2018-07-23 – 2018-07-27
ISBN
IST-REx-ID
Cite this
Alistarh D-A, De Sa C, Konstantinov NH. The convergence of stochastic gradient descent in asynchronous shared memory. In: Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing - PODC ’18. ACM Press; 2018:169-178. doi:10.1145/3212734.3212763
Alistarh, D.-A., De Sa, C., & Konstantinov, N. H. (2018). The convergence of stochastic gradient descent in asynchronous shared memory. In Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing - PODC ’18 (pp. 169–178). Egham, United Kingdom: ACM Press. https://doi.org/10.1145/3212734.3212763
Alistarh, Dan-Adrian, Christopher De Sa, and Nikola H Konstantinov. “The Convergence of Stochastic Gradient Descent in Asynchronous Shared Memory.” In Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing - PODC ’18, 169–78. ACM Press, 2018. https://doi.org/10.1145/3212734.3212763.
D.-A. Alistarh, C. De Sa, and N. H. Konstantinov, “The convergence of stochastic gradient descent in asynchronous shared memory,” in Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing - PODC ’18, Egham, United Kingdom, 2018, pp. 169–178.
Alistarh D-A, De Sa C, Konstantinov NH. 2018. The convergence of stochastic gradient descent in asynchronous shared memory. Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing - PODC ’18. PODC: Principles of Distributed Computing, 169–178.
Alistarh, Dan-Adrian, et al. “The Convergence of Stochastic Gradient Descent in Asynchronous Shared Memory.” Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing - PODC ’18, ACM Press, 2018, pp. 169–78, doi:10.1145/3212734.3212763.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Link(s) to Main File(s)
Access Level
Open Access
Web of Science
View record in Web of Science®
Sources
arXiv 1803.08841