7 Publications

[7]
2024 | Published | Conference Paper | IST-REx-ID: 15011 | OA
Kurtic, Eldar, Torsten Hoefler, and Dan-Adrian Alistarh. “How to Prune Your Language Model: Recovering Accuracy on the ‘Sparsity May Cry’ Benchmark.” In Proceedings of Machine Learning Research, 234:542–53. ML Research Press, 2024.
[Preprint] View | Download Preprint (ext.) | arXiv
 
[6]
2024 | Published | Conference Paper | IST-REx-ID: 18975 | OA
Modoranu, Ionut-Vlad, Aleksei Kalinov, Eldar Kurtic, Elias Frantar, and Dan-Adrian Alistarh. “Error Feedback Can Accurately Compress Preconditioners.” In 41st International Conference on Machine Learning, 235:35910–33. ML Research Press, 2024.
[Preprint] View | Download Preprint (ext.) | arXiv
 
[5]
2024 | Published | Conference Paper | IST-REx-ID: 19510 | OA
Modoranu, Ionut-Vlad, Mher Safaryan, Grigory Malinovsky, Eldar Kurtic, Thomas Robert, Peter Richtárik, and Dan-Adrian Alistarh. “MICROADAM: Accurate Adaptive Optimization with Low Space Overhead and Provable Convergence.” In 38th Conference on Neural Information Processing Systems, Vol. 37. Neural Information Processing Systems Foundation, 2024.
[Preprint] View | Files available | Download Preprint (ext.) | arXiv
 
[4]
2023 | Published | Conference Paper | IST-REx-ID: 14460 | OA
Nikdan, Mahdi, Tommaso Pegolotti, Eugenia B. Iofinova, Eldar Kurtic, and Dan-Adrian Alistarh. “SparseProp: Efficient Sparse Backpropagation for Faster Training of Neural Networks at the Edge.” In Proceedings of the 40th International Conference on Machine Learning, 202:26215–27. ML Research Press, 2023.
[Preprint] View | Download Preprint (ext.) | arXiv
 
[3]
2023 | Published | Conference Paper | IST-REx-ID: 13053 | OA
Krumes, Alexandra, Adrian Vladu, Eldar Kurtic, Christoph Lampert, and Dan-Adrian Alistarh. “CrAM: A Compression-Aware Minimizer.” In 11th International Conference on Learning Representations. OpenReview, 2023.
[Published Version] View | Files available | Download Published Version (ext.) | arXiv
 
[2]
2022 | Published | Conference Paper | IST-REx-ID: 17088 | OA
Kurtic, Eldar, Daniel Campos, Tuan Nguyen, Elias Frantar, Mark Kurtz, Benjamin Fineran, Michael Goin, and Dan-Adrian Alistarh. “The Optimal BERT Surgeon: Scalable and Accurate Second-Order Pruning for Large Language Models.” In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, 4163–81. Association for Computational Linguistics, 2022. https://doi.org/10.18653/v1/2022.emnlp-main.279.
[Published Version] View | Files available | DOI | arXiv
 
[1]
2021 | Published | Conference Paper | IST-REx-ID: 11463 | OA
Frantar, Elias, Eldar Kurtic, and Dan-Adrian Alistarh. “M-FAC: Efficient Matrix-Free Approximations of Second-Order Information.” In 35th Conference on Neural Information Processing Systems, 34:14873–86. Neural Information Processing Systems Foundation, 2021.
[Published Version] View | Download Published Version (ext.) | arXiv
 
