Scalable belief propagation via relaxed scheduling
Aksenov V, Alistarh D-A, Korhonen J. 2020. Scalable belief propagation via relaxed scheduling. NeurIPS: Conference on Neural Information Processing Systems, Advances in Neural Information Processing Systems, vol. 33, 22361–22372.
Conference Paper | Published | English
Scopus indexed
Author
Vitaly Aksenov; Dan-Adrian Alistarh; Janne Korhonen
Corresponding author has ISTA affiliation
Series Title
Advances in Neural Information Processing Systems
Abstract
The ability to leverage large-scale hardware parallelism has been one of the key enablers of the accelerated recent progress in machine learning. Consequently, there has been considerable effort invested into developing efficient parallel variants of classic machine learning algorithms. However, despite the wealth of knowledge on parallelization, some classic machine learning algorithms often prove hard to parallelize efficiently while maintaining convergence. In this paper, we focus on efficient parallel algorithms for the key machine learning task of inference on graphical models, in particular on the fundamental belief propagation algorithm. We address the challenge of efficiently parallelizing this classic paradigm by showing how to leverage scalable relaxed schedulers in this context. We present an extensive empirical study, showing that our approach outperforms previous parallel belief propagation implementations both in terms of scalability and in terms of wall-clock convergence time, on a range of practical applications.
Publishing Year
2020
Date Published
2020-12-06
Publisher
Neural Information Processing Systems Foundation
Acknowledgement
We thank Marco Mondelli for discussions related to LDPC decoding, and Giorgi Nadiradze for discussions on analysis of relaxed schedulers. This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 805223 ScaleML).
Volume
33
Page
22361-22372
Conference
NeurIPS: Conference on Neural Information Processing Systems
Conference Location
Vancouver, Canada
Conference Date
2020-12-06 – 2020-12-12
Cite this
Aksenov V, Alistarh D-A, Korhonen J. Scalable belief propagation via relaxed scheduling. In: Vol 33. Neural Information Processing Systems Foundation; 2020:22361-22372.
Aksenov, V., Alistarh, D.-A., & Korhonen, J. (2020). Scalable belief propagation via relaxed scheduling (Vol. 33, pp. 22361–22372). Presented at the NeurIPS: Conference on Neural Information Processing Systems, Vancouver, Canada: Neural Information Processing Systems Foundation.
Aksenov, Vitaly, Dan-Adrian Alistarh, and Janne Korhonen. “Scalable Belief Propagation via Relaxed Scheduling,” 33:22361–72. Neural Information Processing Systems Foundation, 2020.
V. Aksenov, D.-A. Alistarh, and J. Korhonen, “Scalable belief propagation via relaxed scheduling,” presented at the NeurIPS: Conference on Neural Information Processing Systems, Vancouver, Canada, 2020, vol. 33, pp. 22361–22372.
Aksenov V, Alistarh D-A, Korhonen J. 2020. Scalable belief propagation via relaxed scheduling. NeurIPS: Conference on Neural Information Processing Systems, Advances in Neural Information Processing Systems, vol. 33, 22361–22372.
Aksenov, Vitaly, et al. Scalable Belief Propagation via Relaxed Scheduling. Vol. 33, Neural Information Processing Systems Foundation, 2020, pp. 22361–72.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Sources
arXiv 2002.11505