Hybrid decentralized optimization: Leveraging both first- and zeroth-order optimizers for faster convergence
Talaei S, Ansaripour M, Nadiradze G, Alistarh D-A. 2025. Hybrid decentralized optimization: Leveraging both first- and zeroth-order optimizers for faster convergence. Proceedings of the 39th AAAI Conference on Artificial Intelligence. 39(19), 20778–20786.
Journal Article | Published | English | Scopus indexed
Corresponding author has ISTA affiliation
Abstract
Distributed optimization is the standard way of speeding up machine learning training, and most of the research in the area focuses on distributed first-order, gradient-based methods. Yet, there are settings where some computationally bounded nodes may not be able to implement first-order, gradient-based optimization, while they could still contribute to joint optimization tasks. In this paper, we initiate the study of hybrid decentralized optimization, studying settings where nodes with zeroth-order and first-order optimization capabilities co-exist in a distributed system and attempt to jointly solve an optimization task over some data distribution. We essentially show that, under reasonable parameter settings, such a system can not only withstand noisier zeroth-order agents but can even benefit from integrating such agents into the optimization process, rather than ignoring their information. At the core of our approach is a new analysis of distributed optimization with noisy and possibly biased gradient estimators, which may be of independent interest. Our results hold for both convex and non-convex objectives. Experimental results on standard optimization tasks confirm our analysis, showing that hybrid first/zeroth-order optimization can be practical, even when training deep neural networks.
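The abstract refers to zeroth-order agents that contribute to optimization using only function evaluations, without computing gradients. As a rough illustration of that idea (not the paper's actual estimator), the following sketch implements a standard two-point Gaussian-smoothing gradient estimator; the function name and the parameters `mu` and `num_samples` are assumptions made for this example.

```python
import numpy as np

def zeroth_order_gradient(f, x, mu=1e-4, num_samples=100, seed=None):
    """Two-point zeroth-order gradient estimator (illustrative sketch).

    Approximates grad f(x) from function evaluations alone, as a
    computationally bounded node might. `mu` is the smoothing radius,
    `num_samples` the number of random directions averaged over.
    """
    rng = np.random.default_rng(seed)
    g = np.zeros_like(x)
    for _ in range(num_samples):
        u = rng.standard_normal(x.shape[0])  # random Gaussian direction
        # Finite difference along u, scaled back onto u.
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / num_samples

# Usage: estimate the gradient of f(x) = ||x||^2, whose true gradient is 2x.
f = lambda x: float(np.dot(x, x))
x = np.array([1.0, -2.0, 0.5])
g = zeroth_order_gradient(f, x, num_samples=2000, seed=0)
```

Such estimators are unbiased only for the smoothed objective and have higher variance than true gradients, which is exactly the noisy, possibly biased regime the paper's analysis addresses.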
Date Published
2025-04-11
Journal Title
Proceedings of the 39th AAAI Conference on Artificial Intelligence
Publisher
Association for the Advancement of Artificial Intelligence
Acknowledgement
This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 805223 ScaleML). The authors would like to acknowledge Eugenia Iofinova for useful discussions during the inception of this project.
Volume
39
Issue
19
Page
20778-20786
Cite this
Talaei S, Ansaripour M, Nadiradze G, Alistarh D-A. Hybrid decentralized optimization: Leveraging both first- and zeroth-order optimizers for faster convergence. Proceedings of the 39th AAAI Conference on Artificial Intelligence. 2025;39(19):20778-20786. doi:10.1609/aaai.v39i19.34290
Talaei, S., Ansaripour, M., Nadiradze, G., & Alistarh, D.-A. (2025). Hybrid decentralized optimization: Leveraging both first- and zeroth-order optimizers for faster convergence. Proceedings of the 39th AAAI Conference on Artificial Intelligence. Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v39i19.34290
Talaei, Shayan, Matin Ansaripour, Giorgi Nadiradze, and Dan-Adrian Alistarh. “Hybrid Decentralized Optimization: Leveraging Both First- and Zeroth-Order Optimizers for Faster Convergence.” Proceedings of the 39th AAAI Conference on Artificial Intelligence. Association for the Advancement of Artificial Intelligence, 2025. https://doi.org/10.1609/aaai.v39i19.34290.
S. Talaei, M. Ansaripour, G. Nadiradze, and D.-A. Alistarh, “Hybrid decentralized optimization: Leveraging both first- and zeroth-order optimizers for faster convergence,” Proceedings of the 39th AAAI Conference on Artificial Intelligence, vol. 39, no. 19. Association for the Advancement of Artificial Intelligence, pp. 20778–20786, 2025.
Talaei S, Ansaripour M, Nadiradze G, Alistarh D-A. 2025. Hybrid decentralized optimization: Leveraging both first- and zeroth-order optimizers for faster convergence. Proceedings of the 39th AAAI Conference on Artificial Intelligence. 39(19), 20778–20786.
Talaei, Shayan, et al. “Hybrid Decentralized Optimization: Leveraging Both First- and Zeroth-Order Optimizers for Faster Convergence.” Proceedings of the 39th AAAI Conference on Artificial Intelligence, vol. 39, no. 19, Association for the Advancement of Artificial Intelligence, 2025, pp. 20778–86, doi:10.1609/aaai.v39i19.34290.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Sources
arXiv 2210.07703