141 Publications

2024 | Published | Conference Paper | IST-REx-ID: 17093 | OA
Zakerinia, Hossein, Shayan Talaei, Giorgi Nadiradze, and Dan-Adrian Alistarh. “Communication-Efficient Federated Learning with Data and Client Heterogeneity.” In Proceedings of the 27th International Conference on Artificial Intelligence and Statistics, 238:3448–56. ML Research Press, 2024.
[Preprint] View | Download Preprint (ext.) | arXiv
 
2024 | Published | Conference Paper | IST-REx-ID: 17329 | OA
Alistarh, Dan-Adrian, Krishnendu Chatterjee, Mehrdad Karrabi, and John M Lazarsfeld. “Game Dynamics and Equilibrium Computation in the Population Protocol Model.” In Proceedings of the 43rd Annual ACM Symposium on Principles of Distributed Computing, 40–49. Association for Computing Machinery, 2024. https://doi.org/10.1145/3662158.3662768.
[Published Version] View | Files available | DOI
 
2024 | Published | Conference Paper | IST-REx-ID: 17332 | OA
Kokorin, Ilya, Victor Yudov, Vitaly Aksenov, and Dan-Adrian Alistarh. “Wait-Free Trees with Asymptotically-Efficient Range Queries.” In 2024 IEEE International Parallel and Distributed Processing Symposium, 169–79. IEEE, 2024. https://doi.org/10.1109/IPDPS57955.2024.00023.
[Preprint] View | DOI | Download Preprint (ext.) | arXiv
 
2024 | Published | Conference Paper | IST-REx-ID: 17456 | OA
Markov, Ilia, Kaveh Alimohammadi, Elias Frantar, and Dan-Adrian Alistarh. “L-GreCo: Layerwise-Adaptive Gradient Compression for Efficient Data-Parallel Deep Learning.” In Proceedings of Machine Learning and Systems, edited by P. Gibbons, G. Pekhimenko, and C. De Sa, Vol. 6. Association for Computing Machinery, 2024.
[Published Version] View | Files available | Download Published Version (ext.) | arXiv
 
2024 | Published | Thesis | IST-REx-ID: 17485 | OA
Frantar, Elias. “Compressing Large Neural Networks: Algorithms, Systems and Scaling Laws.” Institute of Science and Technology Austria, 2024. https://doi.org/10.15479/at:ista:17485.
[Published Version] View | Files available | DOI
 
2024 | Published | Thesis | IST-REx-ID: 17490 | OA
Markov, Ilia. “Communication-Efficient Distributed Training of Deep Neural Networks: An Algorithms and Systems Perspective.” Institute of Science and Technology Austria, 2024. https://doi.org/10.15479/at:ista:17490.
[Published Version] View | Files available | DOI
 
2024 | Published | Conference Paper | IST-REx-ID: 18061 | OA
Frantar, Elias, and Dan-Adrian Alistarh. “QMoE: Sub-1-Bit Compression of Trillion Parameter Models.” In Proceedings of Machine Learning and Systems, edited by P. Gibbons, G. Pekhimenko, and C. De Sa, Vol. 6, 2024.
[Published Version] View | Files available | Download Published Version (ext.)
 
2024 | Published | Conference Paper | IST-REx-ID: 18062 | OA
Frantar, Elias, Carlos Riquelme Ruiz, Neil Houlsby, Dan-Adrian Alistarh, and Utku Evci. “Scaling Laws for Sparsely-Connected Foundation Models.” In The Twelfth International Conference on Learning Representations, 2024.
[Published Version] View | Files available | Download Published Version (ext.) | arXiv
 
2024 | Published | Conference Paper | IST-REx-ID: 18070
Chatterjee, Bapi, Vyacheslav Kungurtsev, and Dan-Adrian Alistarh. “Federated SGD with Local Asynchrony.” In Proceedings of the 44th International Conference on Distributed Computing Systems, 857–68. IEEE, 2024. https://doi.org/10.1109/ICDCS60910.2024.00084.
View | DOI
 
2024 | Published | Conference Paper | IST-REx-ID: 18113 | OA
Egiazarian, Vage, Andrei Panferov, Denis Kuznedelev, Elias Frantar, Artem Babenko, and Dan-Adrian Alistarh. “Extreme Compression of Large Language Models via Additive Quantization.” In Proceedings of the 41st International Conference on Machine Learning, 235:12284–303. ML Research Press, 2024.
[Preprint] View | Download Preprint (ext.) | arXiv
 
2024 | Published | Conference Paper | IST-REx-ID: 18117 | OA
Nikdan, Mahdi, Soroush Tabesh, Elvir Crncevic, and Dan-Adrian Alistarh. “RoSA: Accurate Parameter-Efficient Fine-Tuning via Robust Adaptation.” In Proceedings of the 41st International Conference on Machine Learning, 235:38187–206. ML Research Press, 2024.
[Preprint] View | Files available | Download Preprint (ext.) | arXiv
 
2024 | Published | Conference Paper | IST-REx-ID: 18121 | OA
Moakhar, Arshia Soltani, Eugenia B Iofinova, Elias Frantar, and Dan-Adrian Alistarh. “SPADE: Sparsity-Guided Debugging for Deep Neural Networks.” In Proceedings of the 41st International Conference on Machine Learning, 235:45955–87. ML Research Press, 2024.
[Preprint] View | Files available | Download Preprint (ext.) | arXiv
 
2024 | Published | Conference Paper | IST-REx-ID: 15011 | OA
Kurtic, Eldar, Torsten Hoefler, and Dan-Adrian Alistarh. “How to Prune Your Language Model: Recovering Accuracy on the ‘Sparsity May Cry’ Benchmark.” In Proceedings of Machine Learning Research, 234:542–53. ML Research Press, 2024.
[Preprint] View | Download Preprint (ext.) | arXiv
 
2024 | Published | Conference Paper | IST-REx-ID: 17469 | OA
Kögler, Kevin, Alexander Shevchenko, Hamed Hassani, and Marco Mondelli. “Compression of Structured Data with Autoencoders: Provable Benefit of Nonlinearities and Depth.” In Proceedings of the 41st International Conference on Machine Learning, 235:24964–25015. ML Research Press, 2024.
[Published Version] View | Files available | Download Published Version (ext.) | arXiv
 
2024 | Published | Thesis | IST-REx-ID: 17465 | OA
Shevchenko, Alexander. “High-Dimensional Limits in Artificial Neural Networks.” Institute of Science and Technology Austria, 2024. https://doi.org/10.15479/at:ista:17465.
[Published Version] View | Files available | DOI
 
2023 | Published | Conference Paper | IST-REx-ID: 17378 | OA
Frantar, Elias, Saleh Ashkboos, Torsten Hoefler, and Dan-Adrian Alistarh. “OPTQ: Accurate Post-Training Quantization for Generative Pre-Trained Transformers.” In 11th International Conference on Learning Representations. International Conference on Learning Representations, 2023.
[Published Version] View | Files available
 
2023 | Published | Journal Article | IST-REx-ID: 12330 | OA
Aksenov, Vitalii, Dan-Adrian Alistarh, Alexandra Drozdova, and Amirkeivan Mohtashami. “The Splay-List: A Distribution-Adaptive Concurrent Skip-List.” Distributed Computing. Springer Nature, 2023. https://doi.org/10.1007/s00446-022-00441-x.
[Preprint] View | DOI | Download Preprint (ext.) | WoS | arXiv
 
2023 | Published | Journal Article | IST-REx-ID: 12566 | OA
Alistarh, Dan-Adrian, Faith Ellen, and Joel Rybicki. “Wait-Free Approximate Agreement on Graphs.” Theoretical Computer Science. Elsevier, 2023. https://doi.org/10.1016/j.tcs.2023.113733.
[Published Version] View | Files available | DOI | WoS
 
2023 | Published | Conference Paper | IST-REx-ID: 12735 | OA
Koval, Nikita, Dan-Adrian Alistarh, and Roman Elizarov. “Fast and Scalable Channels in Kotlin Coroutines.” In Proceedings of the ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming, 107–18. Association for Computing Machinery, 2023. https://doi.org/10.1145/3572848.3577481.
[Preprint] View | DOI | Download Preprint (ext.) | arXiv
 
2023 | Published | Conference Poster | IST-REx-ID: 12736 | OA
Aksenov, Vitaly, Trevor A Brown, Alexander Fedorov, and Ilya Kokorin. Unexpected Scaling in Path Copying Trees. Proceedings of the ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming. Association for Computing Machinery, 2023. https://doi.org/10.1145/3572848.3577512.
[Published Version] View | DOI | Download Published Version (ext.)
 