How well do sparse ImageNet models transfer?
Iofinova EB, Peste A, Kurtz M, Alistarh D-A. 2022. How well do sparse ImageNet models transfer? 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition. CVPR: Computer Vision and Pattern Recognition, 12256–12266.
Download (ext.)
https://doi.org/10.48550/arXiv.2111.13445
[Preprint]
Conference Paper
| Published
| English
Scopus indexed
Author
Corresponding author has ISTA affiliation
Department
Grant
Abstract
Transfer learning is a classic paradigm by which models pretrained on large “upstream” datasets are adapted to yield good results on “downstream” specialized datasets. Generally, more accurate models on the “upstream” dataset tend to provide better transfer accuracy “downstream”. In this work, we perform an in-depth investigation of this phenomenon in the context of convolutional neural networks (CNNs) trained on the ImageNet dataset, which have been pruned, that is, compressed by sparsifying their connections. We consider transfer using unstructured pruned models obtained by applying several state-of-the-art pruning methods, including magnitude-based, second-order, regrowth, lottery-ticket, and regularization approaches, in the context of twelve standard transfer tasks. In a nutshell, our study shows that sparse models can match or even outperform the transfer performance of dense models, even at high sparsities, and, while doing so, can lead to significant inference and even training speedups. At the same time, we observe and analyze significant differences in the behaviour of different pruning methods. The code is available at: https://github.com/IST-DASLab/sparse-imagenet-transfer.
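A minimal, illustrative sketch (Python/PyTorch) of the setting the abstract describes, not the authors' released code (see the GitHub link above for that): global magnitude pruning of a pretrained ImageNet ResNet-50, followed by full fine-tuning of the sparse backbone on a downstream task. The 90% sparsity level, the 102-class head, the hyperparameters, and the downstream_loader are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
import torchvision

# "Upstream" model: ResNet-50 pretrained on ImageNet (torchvision >= 0.13 weights API).
model = torchvision.models.resnet50(weights="IMAGENET1K_V1")

# Unstructured magnitude pruning: zero out the 90% smallest-magnitude weights
# across all convolutional layers (magnitude pruning is one of the families studied).
conv_layers = [(m, "weight") for m in model.modules() if isinstance(m, nn.Conv2d)]
prune.global_unstructured(conv_layers, pruning_method=prune.L1Unstructured, amount=0.9)
# Keep the pruning reparameterization: the forward pass uses weight_orig * weight_mask,
# so pruned connections stay at zero while the remaining weights are fine-tuned.

# Replace the classifier head for a hypothetical 102-class "downstream" task.
model.fc = nn.Linear(model.fc.in_features, 102)

def finetune(downstream_loader, epochs=10, device="cuda"):
    # downstream_loader is assumed to yield (image, label) batches of the downstream dataset.
    model.to(device)
    model.train()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9, weight_decay=1e-4)
    criterion = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in downstream_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()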
Publishing Year
2022
Date Published
2022-09-27
Proceedings Title
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition
Publisher
Institute of Electrical and Electronics Engineers
Acknowledgement
The authors would like to sincerely thank Christoph Lampert and Nir Shavit for fruitful discussions during the development of this work, and Eldar Kurtic for experimental support. EI was supported in part by the FWF DK VGSCO, grant agreement number W1260-N35, while AP and DA acknowledge generous support by the ERC, via Starting Grant 805223 ScaleML.
Page
12256-12266
Conference
CVPR: Computer Vision and Pattern Recognition
Conference Location
New Orleans, LA, United States
Conference Date
2022-06-18 – 2022-06-24
eISSN
IST-REx-ID
Cite this
Iofinova EB, Peste A, Kurtz M, Alistarh D-A. How well do sparse ImageNet models transfer? In: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition. Institute of Electrical and Electronics Engineers; 2022:12256-12266. doi:10.1109/cvpr52688.2022.01195
Iofinova, E. B., Peste, A., Kurtz, M., & Alistarh, D.-A. (2022). How well do sparse ImageNet models transfer? In 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 12256–12266). New Orleans, LA, United States: Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/cvpr52688.2022.01195
Iofinova, Eugenia B, Alexandra Peste, Mark Kurtz, and Dan-Adrian Alistarh. “How Well Do Sparse ImageNet Models Transfer?” In 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition, 12256–66. Institute of Electrical and Electronics Engineers, 2022. https://doi.org/10.1109/cvpr52688.2022.01195.
E. B. Iofinova, A. Peste, M. Kurtz, and D.-A. Alistarh, “How well do sparse ImageNet models transfer?,” in 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, United States, 2022, pp. 12256–12266.
Iofinova EB, Peste A, Kurtz M, Alistarh D-A. 2022. How well do sparse ImageNet models transfer? 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition. CVPR: Computer Vision and Pattern Recognition, 12256–12266.
Iofinova, Eugenia B., et al. “How Well Do Sparse ImageNet Models Transfer?” 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Institute of Electrical and Electronics Engineers, 2022, pp. 12256–66, doi:10.1109/cvpr52688.2022.01195.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Link(s) to Main File(s)
Access Level
Open Access
Material in ISTA:
Dissertation containing ISTA record
Sources
arXiv 2111.13445