{"_id":"20302","alternative_title":["PMLR"],"oa":1,"OA_type":"green","publication_status":"published","scopus_import":"1","title":"Revisiting LocalSGD and SCAFFOLD: Improved rates and missing analysis","external_id":{"arxiv":["2501.04443"]},"user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","quality_controlled":"1","page":"2539-2547","status":"public","language":[{"iso":"eng"}],"project":[{"_id":"0599E47C-7A3F-11EA-A408-12923DDC885E","grant_number":"863818","name":"Formal Methods for Stochastic Models: Algorithms and Applications","call_identifier":"H2020"}],"day":"01","author":[{"last_name":"Luo","full_name":"Luo, Ruichen","first_name":"Ruichen","id":"b391db08-1ffe-11ee-8b67-d18ddcfb5a14"},{"last_name":"Stich","full_name":"Stich, Sebastian U.","first_name":"Sebastian U."},{"first_name":"Samuel","full_name":"Horváth, Samuel","last_name":"Horváth"},{"first_name":"Martin","full_name":"Takáč, Martin","last_name":"Takáč"}],"article_processing_charge":"No","type":"conference","citation":{"ista":"Luo R, Stich SU, Horváth S, Takáč M. 2025. Revisiting LocalSGD and SCAFFOLD: Improved rates and missing analysis. The 28th International Conference on Artificial Intelligence and Statistics. AISTATS: Conference on Artificial Intelligence and Statistics, PMLR, vol. 258, 2539–2547.","short":"R. Luo, S.U. Stich, S. Horváth, M. Takáč, in:, The 28th International Conference on Artificial Intelligence and Statistics, ML Research Press, 2025, pp. 2539–2547.","mla":"Luo, Ruichen, et al. “Revisiting LocalSGD and SCAFFOLD: Improved Rates and Missing Analysis.” The 28th International Conference on Artificial Intelligence and Statistics, vol. 258, ML Research Press, 2025, pp. 2539–47.","chicago":"Luo, Ruichen, Sebastian U. Stich, Samuel Horváth, and Martin Takáč. “Revisiting LocalSGD and SCAFFOLD: Improved Rates and Missing Analysis.” In The 28th International Conference on Artificial Intelligence and Statistics, 258:2539–47. ML Research Press, 2025.","apa":"Luo, R., Stich, S. 
U., Horváth, S., & Takáč, M. (2025). Revisiting LocalSGD and SCAFFOLD: Improved rates and missing analysis. In The 28th International Conference on Artificial Intelligence and Statistics (Vol. 258, pp. 2539–2547). Mai Khao, Thailand: ML Research Press.","ieee":"R. Luo, S. U. Stich, S. Horváth, and M. Takáč, “Revisiting LocalSGD and SCAFFOLD: Improved rates and missing analysis,” in The 28th International Conference on Artificial Intelligence and Statistics, Mai Khao, Thailand, 2025, vol. 258, pp. 2539–2547.","ama":"Luo R, Stich SU, Horváth S, Takáč M. Revisiting LocalSGD and SCAFFOLD: Improved rates and missing analysis. In: The 28th International Conference on Artificial Intelligence and Statistics. Vol 258. ML Research Press; 2025:2539-2547."},"publication_identifier":{"eissn":["2640-3498"]},"date_published":"2025-05-01T00:00:00Z","department":[{"_id":"KrCh"}],"date_updated":"2025-09-09T07:17:08Z","date_created":"2025-09-07T22:01:35Z","year":"2025","main_file_link":[{"url":"https://doi.org/10.48550/arXiv.2501.04443","open_access":"1"}],"abstract":[{"lang":"eng","text":"LocalSGD and SCAFFOLD are widely used methods in distributed stochastic optimization, with numerous applications in machine learning, large-scale data processing, and federated learning. However, rigorously establishing their theoretical advantages over simpler methods, such as minibatch SGD (MbSGD), has proven challenging, as existing analyses often rely on strong assumptions, unrealistic premises, or overly restrictive scenarios.\r\n\r\nIn this work, we revisit the convergence properties of LocalSGD and SCAFFOLD under a variety of existing or weaker conditions, including gradient similarity, Hessian similarity, weak convexity, and Lipschitz continuity of the Hessian. 
Our analysis shows that (i) LocalSGD achieves faster convergence compared to MbSGD for weakly convex functions without requiring stronger gradient similarity assumptions; (ii) LocalSGD benefits significantly from higher-order similarity and smoothness; and (iii) SCAFFOLD demonstrates faster convergence than MbSGD for a broader class of non-quadratic functions. These theoretical insights provide a clearer understanding of the conditions under which LocalSGD and SCAFFOLD outperform MbSGD."}],"publisher":"ML Research Press","intvolume":" 258","volume":258,"publication":"The 28th International Conference on Artificial Intelligence and Statistics","ec_funded":1,"OA_place":"repository","oa_version":"Preprint","arxiv":1,"month":"05","conference":{"location":"Mai Khao, Thailand","start_date":"2025-05-03","end_date":"2025-05-05","name":"AISTATS: Conference on Artificial Intelligence and Statistics"},"acknowledgement":"The authors thank Eduard Gorbunov, Kumar Kshitij Patel, Anton Rodomanov, and Ali Zindari for helpful discussions during the preparation of this work. This work was partially done during the first author’s stays at CISPA and at MBZUAI. The first author also acknowledges ERC CoG 863818 (ForM-SMArt) and Austrian Science Fund (FWF) 10.55776/COE12."}