CrAM: A Compression-Aware Minimizer
Krumes A, Vladu A, Kurtic E, Lampert C, Alistarh D-A. 2023. CrAM: A Compression-Aware Minimizer. 11th International Conference on Learning Representations. ICLR: International Conference on Learning Representations.
https://openreview.net/pdf?id=_eTZBs-yedr
[Published Version]
Conference Paper | Published | English
Author
Corresponding author has ISTA affiliation
Abstract
Deep neural networks (DNNs) often have to be compressed, via pruning and/or quantization, before they can be deployed in practical settings. In this work we propose a new compression-aware minimizer dubbed CrAM that modifies the optimization step in a principled way, in order to produce models whose local loss behavior is stable under compression operations such as pruning. Thus, dense models trained via CrAM should be compressible post-training, in a single step, without significant accuracy loss. Experimental results on standard benchmarks, such as residual networks for ImageNet classification and BERT models for language modelling, show that CrAM produces dense models that can be more accurate than the standard SGD/Adam-based baselines, but which are stable under weight pruning: specifically, we can prune models in one-shot to 70-80% sparsity with almost no accuracy loss, and to 90% with reasonable (∼1%) accuracy loss, which is competitive with gradual compression methods. Additionally, CrAM can produce sparse models which perform well for transfer learning, and it also works for semi-structured 2:4 pruning patterns supported by GPU hardware. The code for reproducing the results is available at this https URL .
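The abstract describes CrAM as modifying the optimizer step so that the loss stays stable at compressed versions of the weights. As a rough illustration only: the snippet below is a toy sketch in the spirit of the update the abstract describes (perturb the weights, compress the perturbed point, take the gradient there, and apply it together with the dense gradient to the dense weights — roughly the paper's "CrAM+" flavor). A quadratic loss and simple magnitude pruning stand in for a real network and compression operator; all names, hyperparameters, and the loss itself are illustrative, not taken from the paper.

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Toy compression operator C: zero out the smallest-magnitude weights."""
    k = int(len(w) * sparsity)
    out = w.copy()
    if k > 0:
        out[np.argsort(np.abs(w))[:k]] = 0.0
    return out

# Stand-in model: quadratic loss L(w) = 0.5 * ||w - target||^2, gradient w - target.
target = np.array([3.0, -2.0, 0.1, 0.05])
grad = lambda w: w - target
loss = lambda w: 0.5 * np.sum((w - target) ** 2)

def cram_plus_step(w, lr=0.1, rho=0.05, sparsity=0.5):
    """One CrAM-style step (sketch):
    1. perturb the weights along the gradient,
    2. compress the perturbed point,
    3. take the gradient at the compressed point,
    4. update the *dense* weights with that gradient plus the dense gradient."""
    w_comp = magnitude_prune(w + rho * grad(w), sparsity)
    return w - lr * (grad(w_comp) + grad(w))

w = np.zeros_like(target)
for _ in range(200):
    w = cram_plus_step(w)

dense_loss = loss(w)                         # dense model stays accurate ...
pruned_loss = loss(magnitude_prune(w, 0.5))  # ... and one-shot 50% pruning barely hurts
print(dense_loss, pruned_loss)
```

On this toy problem the dense and one-shot-pruned losses end up near the same small value, mirroring the abstract's claim that CrAM-trained dense models can be pruned in a single step with little accuracy loss. This is not the paper's implementation; see the linked PDF and code repository for the actual method.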
Publishing Year
2023
Date Published
2023-05-01
Proceedings Title
11th International Conference on Learning Representations
Publisher
OpenReview
Acknowledgement
AP, EK, DA received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 805223 ScaleML). AV acknowledges the support of the French Agence Nationale de la Recherche (ANR), under grant ANR-21-CE48-0016 (project COMCOPT). We further acknowledge the support from the Scientific Service Units (SSU) of ISTA through resources provided by Scientific Computing (SciComp).
Acknowledged SSUs
Scientific Computing (SciComp)
Conference
ICLR: International Conference on Learning Representations
Conference Location
Kigali, Rwanda
Conference Date
2023-05-01 – 2023-05-05
Cite this
Krumes A, Vladu A, Kurtic E, Lampert C, Alistarh D-A. CrAM: A Compression-Aware Minimizer. In: 11th International Conference on Learning Representations. OpenReview; 2023.
Krumes, A., Vladu, A., Kurtic, E., Lampert, C., & Alistarh, D.-A. (2023). CrAM: A Compression-Aware Minimizer. In 11th International Conference on Learning Representations. Kigali, Rwanda: OpenReview.
Krumes, Alexandra, Adrian Vladu, Eldar Kurtic, Christoph Lampert, and Dan-Adrian Alistarh. “CrAM: A Compression-Aware Minimizer.” In 11th International Conference on Learning Representations. OpenReview, 2023.
A. Krumes, A. Vladu, E. Kurtic, C. Lampert, and D.-A. Alistarh, “CrAM: A Compression-Aware Minimizer,” in 11th International Conference on Learning Representations, Kigali, Rwanda, 2023.
Krumes A, Vladu A, Kurtic E, Lampert C, Alistarh D-A. 2023. CrAM: A Compression-Aware Minimizer. 11th International Conference on Learning Representations. ICLR: International Conference on Learning Representations.
Krumes, Alexandra, et al. “CrAM: A Compression-Aware Minimizer.” 11th International Conference on Learning Representations, OpenReview, 2023.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Main File(s)
File Name
2023_ICLR_Peste.pdf
458.20 KB
Access Level
Open Access
Date Uploaded
2024-07-22
MD5 Checksum
a6eec897e13a91cdc3eeaf309801752c
Link(s) to Main File(s)
Access Level
Open Access
Material in ISTA:
Dissertation containing ISTA record
Sources
arXiv 2207.14200