{"citation":{"mla":"Kurtic, Eldar, et al. “How to Prune Your Language Model: Recovering Accuracy on the ‘Sparsity May Cry’ Benchmark.” Proceedings of Machine Learning Research, vol. 234, ML Research Press, 2024, pp. 542–53.","ieee":"E. Kurtic, T. Hoefler, and D.-A. Alistarh, “How to prune your language model: Recovering accuracy on the ‘Sparsity May Cry’ benchmark,” in Proceedings of Machine Learning Research, Hong Kong, China, 2024, vol. 234, pp. 542–553.","chicago":"Kurtic, Eldar, Torsten Hoefler, and Dan-Adrian Alistarh. “How to Prune Your Language Model: Recovering Accuracy on the ‘Sparsity May Cry’ Benchmark.” In Proceedings of Machine Learning Research, 234:542–53. ML Research Press, 2024.","ama":"Kurtic E, Hoefler T, Alistarh D-A. How to prune your language model: Recovering accuracy on the “Sparsity May Cry” benchmark. In: Proceedings of Machine Learning Research. Vol 234. ML Research Press; 2024:542-553.","ista":"Kurtic E, Hoefler T, Alistarh D-A. 2024. How to prune your language model: Recovering accuracy on the ‘Sparsity May Cry’ benchmark. Proceedings of Machine Learning Research. CPAL: Conference on Parsimony and Learning, PMLR, vol. 234, 542–553.","short":"E. Kurtic, T. Hoefler, D.-A. Alistarh, in: Proceedings of Machine Learning Research, ML Research Press, 2024, pp. 542–553.","apa":"Kurtic, E., Hoefler, T., & Alistarh, D.-A. (2024). How to prune your language model: Recovering accuracy on the “Sparsity May Cry” benchmark. In Proceedings of Machine Learning Research (Vol. 234, pp. 542–553). Hong Kong, China: ML Research Press."},"department":[{"_id":"DaAl"}],"oa_version":"Preprint","title":"How to prune your language model: Recovering accuracy on the \"Sparsity May Cry\" benchmark","external_id":{"arxiv":["2312.13547"]},"scopus_import":"1","intvolume":" 234","date_updated":"2024-10-09T21:08:16Z","language":[{"iso":"eng"}],"abstract":[{"text":"Pruning large language models (LLMs) from the BERT family has emerged as a standard compression benchmark, and several pruning methods have been proposed for this task. The recent “Sparsity May Cry” (SMC) benchmark put into question the validity of all existing methods, exhibiting a more complex setup where many known pruning methods appear to fail. We revisit the question of accurate BERT-pruning during fine-tuning on downstream datasets, and propose a set of general guidelines for successful pruning, even on the challenging SMC benchmark. First, we perform a cost-vs-benefits analysis of pruning model components, such as the embeddings and the classification head; second, we provide a simple-yet-general way of scaling training, sparsification and learning rate schedules relative to the desired target sparsity; finally, we investigate the importance of proper parametrization for Knowledge Distillation in the context of LLMs. Our simple insights lead to state-of-the-art results, both on classic BERT-pruning benchmarks, as well as on the SMC benchmark, showing that even classic gradual magnitude pruning (GMP) can yield competitive results, with the right approach.","lang":"eng"}],"day":"08","article_processing_charge":"No","quality_controlled":"1","date_created":"2024-02-18T23:01:03Z","volume":234,"publisher":"ML Research Press","status":"public","publication":"Proceedings of Machine Learning Research","month":"01","corr_author":"1","arxiv":1,"year":"2024","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","alternative_title":["PMLR"],"main_file_link":[{"url":"https://proceedings.mlr.press/v234/kurtic24a","open_access":"1"}],"publication_status":"published","_id":"15011","page":"542-553","author":[{"last_name":"Kurtic","id":"47beb3a5-07b5-11eb-9b87-b108ec578218","first_name":"Eldar","full_name":"Kurtic, Eldar"},{"last_name":"Hoefler","first_name":"Torsten","full_name":"Hoefler, Torsten"},{"orcid":"0000-0003-3650-940X","last_name":"Alistarh","first_name":"Dan-Adrian","id":"4A899BFC-F248-11E8-B48F-1D18A9856A87","full_name":"Alistarh, Dan-Adrian"}],"date_published":"2024-01-08T00:00:00Z","publication_identifier":{"eissn":["2640-3498"]},"type":"conference","conference":{"name":"CPAL: Conference on Parsimony and Learning","end_date":"2024-01-06","location":"Hong Kong, China","start_date":"2024-01-03"},"oa":1}