
138 Publications


2024 |Published| Conference Paper | IST-REx-ID: 15011 | OA
Kurtic, E., Hoefler, T., & Alistarh, D.-A. (2024). How to prune your language model: Recovering accuracy on the “Sparsity May Cry” benchmark. In Proceedings of Machine Learning Research (Vol. 234, pp. 542–553). Hong Kong, China: ML Research Press.

2024 |Published| Conference Paper | IST-REx-ID: 17093 | OA
Zakerinia, H., Talaei, S., Nadiradze, G., & Alistarh, D.-A. (2024). Communication-efficient federated learning with data and client heterogeneity. In Proceedings of the 27th International Conference on Artificial Intelligence and Statistics (Vol. 238, pp. 3448–3456). Valencia, Spain: ML Research Press.

2024 |Published| Conference Paper | IST-REx-ID: 17329 | OA
Alistarh, D.-A., Chatterjee, K., Karrabi, M., & Lazarsfeld, J. M. (2024). Game dynamics and equilibrium computation in the population protocol model. In Proceedings of the 43rd Annual ACM Symposium on Principles of Distributed Computing (pp. 40–49). Nantes, France: Association for Computing Machinery. https://doi.org/10.1145/3662158.3662768

2024 |Published| Conference Paper | IST-REx-ID: 17332 | OA
Kokorin, I., Yudov, V., Aksenov, V., & Alistarh, D.-A. (2024). Wait-free trees with asymptotically-efficient range queries. In 2024 IEEE International Parallel and Distributed Processing Symposium (pp. 169–179). San Francisco, CA, United States: IEEE. https://doi.org/10.1109/IPDPS57955.2024.00023

2024 |Published| Conference Paper | IST-REx-ID: 17469 | OA
Kögler, K., Shevchenko, A., Hassani, H., & Mondelli, M. (2024). Compression of structured data with autoencoders: Provable benefit of nonlinearities and depth. In Proceedings of the 41st International Conference on Machine Learning (Vol. 235, pp. 24964–25015). Vienna, Austria: ML Research Press.

2024 |Published| Thesis | IST-REx-ID: 17465
Shevchenko, A. (2024). High-dimensional limits in artificial neural networks. Institute of Science and Technology Austria. https://doi.org/10.15479/at:ista:17465

2024 |Published| Thesis | IST-REx-ID: 17490 | OA
Markov, I. (2024). Communication-efficient distributed training of deep neural networks: An algorithms and systems perspective. Institute of Science and Technology Austria. https://doi.org/10.15479/at:ista:17490

2024 |Published| Conference Paper | IST-REx-ID: 17456 | OA
Markov, I., Alimohammadi, K., Frantar, E., & Alistarh, D.-A. (2024). L-GreCo: Layerwise-adaptive gradient compression for efficient data-parallel deep learning. In P. Gibbons, G. Pekhimenko, & C. De Sa (Eds.), Proceedings of Machine Learning and Systems (Vol. 6). Athens, Greece: Association for Computing Machinery.

2024 |Published| Conference Paper | IST-REx-ID: 18070
Chatterjee, B., Kungurtsev, V., & Alistarh, D.-A. (2024). Federated SGD with local asynchrony. In Proceedings of the 44th International Conference on Distributed Computing Systems (pp. 857–868). Jersey City, NJ, United States: IEEE. https://doi.org/10.1109/ICDCS60910.2024.00084

2024 |Published| Thesis | IST-REx-ID: 17485 | OA
Frantar, E. (2024). Compressing large neural networks: Algorithms, systems and scaling laws. Institute of Science and Technology Austria. https://doi.org/10.15479/at:ista:17485

2024 |Published| Conference Paper | IST-REx-ID: 18061 | OA
Frantar, E., & Alistarh, D.-A. (2024). QMoE: Sub-1-bit compression of trillion parameter models. In P. Gibbons, G. Pekhimenko, & C. De Sa (Eds.), Proceedings of Machine Learning and Systems (Vol. 6). Santa Clara, CA, United States.

2024 |Published| Conference Paper | IST-REx-ID: 18062 | OA
Frantar, E., Ruiz, C. R., Houlsby, N., Alistarh, D.-A., & Evci, U. (2024). Scaling laws for sparsely-connected foundation models. In The Twelfth International Conference on Learning Representations. Vienna, Austria.

2023 |Published| Conference Paper | IST-REx-ID: 12735 | OA
Koval, N., Alistarh, D.-A., & Elizarov, R. (2023). Fast and scalable channels in Kotlin Coroutines. In Proceedings of the ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming (pp. 107–118). Montreal, QC, Canada: Association for Computing Machinery. https://doi.org/10.1145/3572848.3577481

2023 |Published| Conference Poster | IST-REx-ID: 12736 | OA
Aksenov, V., Brown, T. A., Fedorov, A., & Kokorin, I. (2023). Unexpected scaling in path copying trees. Proceedings of the ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming (pp. 438–440). Montreal, QC, Canada: Association for Computing Machinery. https://doi.org/10.1145/3572848.3577512

2023 |Published| Journal Article | IST-REx-ID: 13179 | OA
Koval, N., Khalanskiy, D., & Alistarh, D.-A. (2023). CQS: A formally-verified framework for fair and abortable synchronization. Proceedings of the ACM on Programming Languages. Association for Computing Machinery. https://doi.org/10.1145/3591230

2023 |Published| Journal Article | IST-REx-ID: 12566 | OA
Alistarh, D.-A., Ellen, F., & Rybicki, J. (2023). Wait-free approximate agreement on graphs. Theoretical Computer Science. Elsevier. https://doi.org/10.1016/j.tcs.2023.113733

2023 |Published| Journal Article | IST-REx-ID: 12330 | OA
Aksenov, V., Alistarh, D.-A., Drozdova, A., & Mohtashami, A. (2023). The splay-list: A distribution-adaptive concurrent skip-list. Distributed Computing. Springer Nature. https://doi.org/10.1007/s00446-022-00441-x

2023 |Published| Conference Paper | IST-REx-ID: 14460 | OA
Nikdan, M., Pegolotti, T., Iofinova, E. B., Kurtic, E., & Alistarh, D.-A. (2023). SparseProp: Efficient sparse backpropagation for faster training of neural networks at the edge. In Proceedings of the 40th International Conference on Machine Learning (Vol. 202, pp. 26215–26227). Honolulu, HI, United States: ML Research Press.

2023 |Published| Journal Article | IST-REx-ID: 14364 | OA
Alistarh, D.-A., Aspnes, J., Ellen, F., Gelashvili, R., & Zhu, L. (2023). Why extension-based proofs fail. SIAM Journal on Computing. Society for Industrial and Applied Mathematics. https://doi.org/10.1137/20M1375851

2023 |Published| Conference Paper | IST-REx-ID: 14771 | OA
Iofinova, E. B., Peste, E.-A., & Alistarh, D.-A. (2023). Bias in pruned vision models: In-depth analysis and countermeasures. In 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 24364–24373). Vancouver, BC, Canada: IEEE. https://doi.org/10.1109/cvpr52729.2023.02334
