Loss aware post-training quantization

Nahshan Y, Chmiel B, Baskin C, Zheltonozhskii E, Banner R, Bronstein AM, Mendelson A. 2021. Loss aware post-training quantization. Machine Learning. 110(11–12), 3245–3262.

Journal Article | Published | English

Scopus indexed
Author
Nahshan, Yury; Chmiel, Brian; Baskin, Chaim; Zheltonozhskii, Evgenii; Banner, Ron; Bronstein, Alex M.; Mendelson, Avi
Abstract
Neural network quantization enables the deployment of large models on resource-constrained devices. Current post-training quantization methods fall short in terms of accuracy for INT4 (or lower) but provide reasonable accuracy for INT8 (or above). In this work, we study the effect of quantization on the structure of the loss landscape. We show that the structure is flat and separable for mild quantization, enabling straightforward post-training quantization methods to achieve good results. We show that with more aggressive quantization, the loss landscape becomes highly non-separable with steep curvature, making the selection of quantization parameters more challenging. Armed with this understanding, we design a method that quantizes the layer parameters jointly, enabling significant accuracy improvement over current post-training quantization methods. Reference implementation is available at https://github.com/ynahshan/nn-quantization-pytorch/tree/master/lapq.
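The abstract's point that aggressive (INT4) quantization makes the choice of quantization parameters critical can be illustrated with a minimal sketch. The helper below implements plain symmetric uniform quantization with a clipping threshold; the function name, the Gaussian weights, and the candidate clip values are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def uniform_quantize(x, clip, bits):
    # Symmetric uniform quantization to the given bit width,
    # clipping values to [-clip, clip]. Illustrative helper only,
    # not the method proposed in the paper.
    levels = 2 ** (bits - 1) - 1          # e.g. 7 levels per side for INT4
    step = clip / levels                   # quantization step size
    q = np.clip(np.round(x / step), -levels, levels)
    return q * step

rng = np.random.default_rng(0)
w = rng.normal(size=10_000)               # stand-in for a layer's weights

# At INT4, the quantization error depends strongly on the clip value,
# which is why parameter selection matters at low bit widths:
for clip in (1.0, 2.0, 4.0):
    mse = np.mean((w - uniform_quantize(w, clip, 4)) ** 2)
    print(f"clip={clip}: MSE={mse:.4f}")
```

Sweeping the clip value per layer in isolation is what simple post-training methods do; the paper's observation is that at low bit widths the loss landscape couples the layers, motivating joint selection of these parameters.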
Date Published
2021-12-01
Journal Title
Machine Learning
Publisher
Springer Nature
Volume
110
Issue
11-12
Page
3245-3262

Cite this

Nahshan Y, Chmiel B, Baskin C, et al. Loss aware post-training quantization. Machine Learning. 2021;110(11-12):3245-3262. doi:10.1007/s10994-021-06053-z
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]

Link(s) to Main File(s)
Access Level
Open Access


Sources

arXiv 1911.07190