SPDY: Accurate pruning with speedup guarantees

Frantar E, Alistarh D-A. 2022. SPDY: Accurate pruning with speedup guarantees. 39th International Conference on Machine Learning. ICML: International Conference on Machine Learning, PMLR, vol. 162, 6726–6743.

Download
OA 2022_PMLR_Frantar.pdf 615.92 KB [Published Version]
Conference Paper | Published | English

Scopus indexed

Corresponding author has ISTA affiliation

Series Title
PMLR
Abstract
The recent focus on the efficiency of deep neural networks (DNNs) has led to significant work on model compression approaches, of which weight pruning is one of the most popular. At the same time, there is rapidly-growing computational support for efficiently executing the unstructured-sparse models obtained via pruning. Yet, most existing pruning methods minimize just the number of remaining weights, i.e. the size of the model, rather than optimizing for inference time. We address this gap by introducing SPDY, a new compression method which automatically determines layer-wise sparsity targets achieving a desired inference speedup on a given system, while minimizing accuracy loss. SPDY is the composition of two new techniques. The first is an efficient and general dynamic programming algorithm for solving constrained layer-wise compression problems, given a set of layer-wise error scores. The second technique is a local search procedure for automatically determining such scores in an accurate and robust manner. Experiments across popular vision and language models show that SPDY guarantees speedups while recovering higher accuracy relative to existing strategies, both for one-shot and gradual pruning scenarios, and is compatible with most existing pruning approaches. We also extend our approach to the recently-proposed task of pruning with very little data, where we achieve the best known accuracy recovery when pruning to the GPU-supported 2:4 sparsity pattern.
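The abstract's first technique is a dynamic programming algorithm that, given per-layer error scores, selects layer-wise sparsity targets to minimize total error under a timing budget. A minimal sketch of that style of constrained allocation is below; this is not the authors' implementation, and the function name `allocate`, the integer-cost discretization, and the toy numbers are illustrative assumptions.

```python
# Hedged sketch (not the paper's code): dynamic programming for
# constrained layer-wise compression. Each layer has candidate
# sparsity choices, each with an integer timing cost and an error
# score; we pick one choice per layer to minimize total error
# subject to a total time budget.

def allocate(choices, budget):
    """choices[l] = list of (cost, error) pairs for layer l;
    budget = maximum total cost (integer-discretized timings).
    Returns (min_total_error, per-layer choice indices)."""
    # best[b] maps an exact total cost b to the lowest-error
    # partial assignment reaching that cost so far
    best = {0: (0.0, [])}
    for layer in choices:
        nxt = {}
        for b, (err, picks) in best.items():
            for i, (cost, e) in enumerate(layer):
                nb = b + cost
                if nb > budget:
                    continue  # infeasible under the budget
                cand = (err + e, picks + [i])
                if nb not in nxt or cand[0] < nxt[nb][0]:
                    nxt[nb] = cand
        best = nxt
    # cheapest-error feasible assignment across all total costs
    return min(best.values(), key=lambda t: t[0])
```

For example, with two layers and a budget of 4, dense choices being slow but error-free, the routine trades error in one layer for speed so the other can stay accurate. The second technique from the abstract, the local search that produces the error scores themselves, is orthogonal and not sketched here.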
Publishing Year
2022
Date Published
2022-07-20
Proceedings Title
39th International Conference on Machine Learning
Publisher
ML Research Press
Acknowledgement
We gratefully acknowledge funding from the European Research Council (ERC) under the European Union’s Horizon 2020 programme (grant agreement No 805223 ScaleML), as well as computational support from AWS EC2. We thank Eldar Kurtic for code and hyper-parameters for BERT pruning, and the Neural Magic Team, notably Michael Goin and Mark Kurtz, for support with their software.
Volume
162
Page
6726-6743
Conference
ICML: International Conference on Machine Learning
Conference Location
Baltimore, MD, United States
Conference Date
2022-07-17 – 2022-07-23
Cite this

Frantar E, Alistarh D-A. SPDY: Accurate pruning with speedup guarantees. In: 39th International Conference on Machine Learning. Vol 162. ML Research Press; 2022:6726-6743.
Frantar, E., & Alistarh, D.-A. (2022). SPDY: Accurate pruning with speedup guarantees. In 39th International Conference on Machine Learning (Vol. 162, pp. 6726–6743). Baltimore, MD, United States: ML Research Press.
Frantar, Elias, and Dan-Adrian Alistarh. “SPDY: Accurate Pruning with Speedup Guarantees.” In 39th International Conference on Machine Learning, 162:6726–43. ML Research Press, 2022.
E. Frantar and D.-A. Alistarh, “SPDY: Accurate pruning with speedup guarantees,” in 39th International Conference on Machine Learning, Baltimore, MD, United States, 2022, vol. 162, pp. 6726–6743.
Frantar E, Alistarh D-A. 2022. SPDY: Accurate pruning with speedup guarantees. 39th International Conference on Machine Learning. ICML: International Conference on Machine Learning, PMLR, vol. 162, 6726–6743.
Frantar, Elias, and Dan-Adrian Alistarh. “SPDY: Accurate Pruning with Speedup Guarantees.” 39th International Conference on Machine Learning, vol. 162, ML Research Press, 2022, pp. 6726–43.
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC-BY 4.0):
Main File(s)
File Name
Access Level
OA Open Access
Date Uploaded
2024-08-19
MD5 Checksum
5179a1e4dfc0fbfab6674907299e414a

