Optimal brain compression: A framework for accurate post-training quantization and pruning

Frantar E, Singh SP, Alistarh D-A. 2022. Optimal brain compression: A framework for accurate post-training quantization and pruning. 36th Conference on Neural Information Processing Systems. NeurIPS: Neural Information Processing Systems, NeurIPS, vol. 35.

Main file: 2022_NeurIPS_Frantar.pdf (491.84 KB, Submitted Version, Open Access)
Conference Paper | Published | English

Scopus indexed

Corresponding author has ISTA affiliation

Series Title: NeurIPS
Abstract
We consider the problem of model compression for deep neural networks (DNNs) in the challenging one-shot/post-training setting, in which we are given an accurate trained model, and must compress it without any retraining, based only on a small amount of calibration input data. This problem has become popular in view of the emerging software and hardware support for executing models compressed via pruning and/or quantization with speedup, and well-performing solutions have been proposed independently for both compression approaches. In this paper, we introduce a new compression framework which covers both weight pruning and quantization in a unified setting, is time- and space-efficient, and considerably improves upon the practical performance of existing post-training methods. At the technical level, our approach is based on an exact and efficient realization of the classical Optimal Brain Surgeon (OBS) framework of [LeCun, Denker, and Solla, 1990] extended to also cover weight quantization at the scale of modern DNNs. From the practical perspective, our experimental results show that it can improve significantly upon the compression-accuracy trade-offs of existing post-training methods, and that it can enable the accurate compound application of both pruning and quantization in a post-training setting.
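The classical OBS update that the abstract builds on can be illustrated with a toy single-step sketch. This is only a minimal illustration of the well-known OBS saliency and compensation formulas, not the paper's efficient exact realization; the function name, dimensions, and calibration setup are all illustrative assumptions.

```python
import numpy as np

def obs_prune_one(w, H_inv):
    """One greedy Optimal Brain Surgeon step: zero the weight whose removal
    least increases the (quadratic) loss, then compensate the survivors.

    Classical OBS formulas:
      saliency_q = w_q^2 / (2 [H^-1]_qq)
      update     = -(w_q / [H^-1]_qq) * H^-1[:, q]
    """
    saliency = w ** 2 / (2.0 * np.diag(H_inv))
    q = int(np.argmin(saliency))                  # cheapest weight to remove
    w = w - (w[q] / H_inv[q, q]) * H_inv[:, q]    # compensating update
    w[q] = 0.0                                    # enforce an exact zero
    return w, q

# Hessian proxy from a small calibration set X (H proportional to X^T X for a
# squared loss), with light damping so the inverse exists. All sizes are toy.
rng = np.random.default_rng(0)
X = rng.standard_normal((128, 6))
H = X.T @ X / len(X) + 1e-6 * np.eye(6)
w = rng.standard_normal(6)
w_pruned, q = obs_prune_one(w, np.linalg.inv(H))
```

Repeating this step until a target sparsity is reached gives the greedy OBS pruner; the paper's contribution is making an exact variant of this procedure tractable at the scale of modern DNN layers.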
Publishing Year: 2022
Date Published: 2022-12-01
Proceedings Title: 36th Conference on Neural Information Processing Systems
Publisher: ML Research Press
Acknowledgement
We gratefully acknowledge funding from the European Research Council (ERC) under the European Union’s Horizon 2020 programme (grant agreement No 805223 ScaleML), as well as computational support from AWS EC2. We thank Eldar Kurtic for providing us BERT code and pretrained models, and the Neural Magic Team, notably Michael Goin and Mark Kurtz, for support with their software.
Volume: 35
Conference: NeurIPS: Neural Information Processing Systems
Conference Location: New Orleans, LA, United States
Conference Date: 2022-11-28 – 2022-12-09
IST-REx-ID

Cite this

Frantar E, Singh SP, Alistarh D-A. Optimal brain compression: A framework for accurate post-training quantization and pruning. In: 36th Conference on Neural Information Processing Systems. Vol 35. ML Research Press; 2022.
Frantar, E., Singh, S. P., & Alistarh, D.-A. (2022). Optimal brain compression: A framework for accurate post-training quantization and pruning. In 36th Conference on Neural Information Processing Systems (Vol. 35). New Orleans, LA, United States: ML Research Press.
Frantar, Elias, Sidak Pal Singh, and Dan-Adrian Alistarh. “Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning.” In 36th Conference on Neural Information Processing Systems, Vol. 35. ML Research Press, 2022.
E. Frantar, S. P. Singh, and D.-A. Alistarh, “Optimal brain compression: A framework for accurate post-training quantization and pruning,” in 36th Conference on Neural Information Processing Systems, New Orleans, LA, United States, 2022, vol. 35.
Frantar E, Singh SP, Alistarh D-A. 2022. Optimal brain compression: A framework for accurate post-training quantization and pruning. 36th Conference on Neural Information Processing Systems. NeurIPS: Neural Information Processing Systems, NeurIPS, vol. 35.
Frantar, Elias, et al. “Optimal Brain Compression: A Framework for Accurate Post-Training Quantization and Pruning.” 36th Conference on Neural Information Processing Systems, vol. 35, ML Research Press, 2022.
Main File(s)
File Name: 2022_NeurIPS_Frantar.pdf
Access Level: OA Open Access
Date Uploaded: 2024-08-05
MD5 Checksum: 38e7d75f578e8d2e207c81895e09f211


Material in ISTA:
Dissertation containing ISTA record

Sources: arXiv 2208.11580
