{"publisher":"IEEE","title":"Gradient compression for communication-limited convex optimization","user_id":"c635000d-4b10-11ee-a964-aac5a93f6ac1","month":"01","date_updated":"2023-09-06T11:14:55Z","publication":"2018 IEEE Conference on Decision and Control","day":"21","publication_status":"published","department":[{"_id":"DaAl"}],"doi":"10.1109/cdc.2018.8619625","status":"public","article_processing_charge":"No","conference":{"name":"CDC: Conference on Decision and Control","end_date":"2018-12-19","location":"Miami Beach, FL, United States","start_date":"2018-12-17"},"scopus_import":"1","date_created":"2019-11-26T15:07:49Z","author":[{"first_name":"Sarit","last_name":"Khirirat","full_name":"Khirirat, Sarit"},{"last_name":"Johansson","full_name":"Johansson, Mikael","first_name":"Mikael"},{"orcid":"0000-0003-3650-940X","id":"4A899BFC-F248-11E8-B48F-1D18A9856A87","last_name":"Alistarh","full_name":"Alistarh, Dan-Adrian","first_name":"Dan-Adrian"}],"abstract":[{"text":"Data-rich applications in machine-learning and control have motivated an intense research on large-scale optimization. Novel algorithms have been proposed and shown to have optimal convergence rates in terms of iteration counts. However, their practical performance is severely degraded by the cost of exchanging high-dimensional gradient vectors between computing nodes. Several gradient compression heuristics have recently been proposed to reduce communications, but few theoretical results exist that quantify how they impact algorithm convergence. This paper establishes and strengthens the convergence guarantees for gradient descent under a family of gradient compression techniques. For convex optimization problems, we derive admissible step sizes and quantify both the number of iterations and the number of bits that need to be exchanged to reach a target accuracy. Finally, we validate the performance of different gradient compression techniques in simulations. The numerical results highlight the properties of different gradient compression algorithms and confirm that fast convergence with limited information exchange is possible.","lang":"eng"}],"citation":{"ama":"Khirirat S, Johansson M, Alistarh D-A. Gradient compression for communication-limited convex optimization. In: 2018 IEEE Conference on Decision and Control. IEEE; 2019. doi:10.1109/cdc.2018.8619625","ieee":"S. Khirirat, M. Johansson, and D.-A. Alistarh, “Gradient compression for communication-limited convex optimization,” in 2018 IEEE Conference on Decision and Control, Miami Beach, FL, United States, 2019.","ista":"Khirirat S, Johansson M, Alistarh D-A. 2019. Gradient compression for communication-limited convex optimization. 2018 IEEE Conference on Decision and Control. CDC: Conference on Decision and Control, 8619625.","short":"S. Khirirat, M. Johansson, D.-A. Alistarh, in:, 2018 IEEE Conference on Decision and Control, IEEE, 2019.","mla":"Khirirat, Sarit, et al. “Gradient Compression for Communication-Limited Convex Optimization.” 2018 IEEE Conference on Decision and Control, 8619625, IEEE, 2019, doi:10.1109/cdc.2018.8619625.","apa":"Khirirat, S., Johansson, M., & Alistarh, D.-A. (2019). Gradient compression for communication-limited convex optimization. In 2018 IEEE Conference on Decision and Control. Miami Beach, FL, United States: IEEE. https://doi.org/10.1109/cdc.2018.8619625","chicago":"Khirirat, Sarit, Mikael Johansson, and Dan-Adrian Alistarh. 
“Gradient Compression for Communication-Limited Convex Optimization.” In 2018 IEEE Conference on Decision and Control. IEEE, 2019. https://doi.org/10.1109/cdc.2018.8619625."},"external_id":{"isi":["000458114800023"]},"_id":"7122","publication_identifier":{"isbn":["9781538613955"],"issn":["0743-1546"]},"language":[{"iso":"eng"}],"article_number":"8619625","type":"conference","quality_controlled":"1","year":"2019","isi":1,"oa_version":"None","date_published":"2019-01-21T00:00:00Z"}
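Note: the abstract above describes gradient descent combined with a gradient compression operator. The following is a minimal illustrative sketch only, not the algorithm, compressor family, or step-size rule analyzed in the paper; the top-K sparsifier, the 1/L step size, and the least-squares test problem are assumptions chosen to keep the example self-contained.

# Minimal sketch (assumed example, not the paper's method): gradient descent
# where only a top-K-sparsified gradient is used at each iteration, on a
# convex least-squares objective f(x) = 0.5*||Ax - b||^2.
import numpy as np

def top_k(g, k):
    """Keep the k largest-magnitude entries of g, zero out the rest."""
    out = np.zeros_like(g)
    idx = np.argpartition(np.abs(g), -k)[-k:]
    out[idx] = g[idx]
    return out

def compressed_gd(A, b, k, step, iters):
    """Gradient descent using the compressed (top-k) gradient at each step."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)       # full local gradient
        x -= step * top_k(grad, k)     # update with the compressed gradient
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 50))
    b = rng.standard_normal(200)
    L = np.linalg.norm(A, 2) ** 2      # smoothness constant of f
    x = compressed_gd(A, b, k=5, step=1.0 / L, iters=500)
    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)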