<?xml version="1.0" encoding="UTF-8"?>

<modsCollection xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://www.loc.gov/mods/v3" xsi:schemaLocation="http://www.loc.gov/mods/v3 http://www.loc.gov/standards/mods/v3/mods-3-3.xsd">
<mods version="3.3">

<genre>conference paper</genre>

<titleInfo><title>AsGrad: A sharp unified analysis of asynchronous-SGD algorithms</title></titleInfo>

  
  
<titleInfo type="alternative">
  
  <title>PMLR</title>
</titleInfo>

<note type="publicationStatus">published</note>


<note type="qualityControlled">yes</note>

<name type="personal">
  <namePart type="given">Rustem</namePart>
  <namePart type="family">Islamov</namePart>
  <role><roleTerm type="text">author</roleTerm></role>
</name>
<name type="personal">
  <namePart type="given">Mher</namePart>
  <namePart type="family">Safaryan</namePart>
  <role><roleTerm type="text">author</roleTerm></role>
  <identifier type="local">dd546b39-0804-11ed-9c55-ef075c39778d</identifier>
</name>
<name type="personal">
  <namePart type="given">Dan-Adrian</namePart>
  <namePart type="family">Alistarh</namePart>
  <role><roleTerm type="text">author</roleTerm></role>
  <identifier type="local">4A899BFC-F248-11E8-B48F-1D18A9856A87</identifier>
  <description xsi:type="identifierDefinition" type="orcid">0000-0003-3650-940X</description>
</name>

<name type="corporate">
  <namePart></namePart>
  <identifier type="local">DaAl</identifier>
  <role>
    <roleTerm type="text">department</roleTerm>
  </role>
</name>



<name type="conference">
  <namePart>AISTATS: Conference on Artificial Intelligence and Statistics</namePart>
</name>



<name type="corporate">
  <namePart>IST-BRIDGE: International postdoctoral program</namePart>
  <role><roleTerm type="text">project</roleTerm></role>
</name>



<abstract lang="eng">We analyze asynchronous-type algorithms for distributed SGD in the heterogeneous setting, where each worker has its own computation and communication speeds, as well as its own data distribution. In these algorithms, workers compute possibly stale stochastic gradients associated with their local data from some earlier iteration and then return those gradients to the server without synchronizing with other workers. We present a unified convergence theory for non-convex smooth functions in the heterogeneous regime. The proposed analysis provides convergence guarantees for pure asynchronous SGD and its various modifications. Moreover, our theory explains what affects the convergence rate and what can be done to improve the performance of asynchronous algorithms. In particular, we introduce a novel asynchronous method based on worker shuffling. As a by-product of our analysis, we also demonstrate convergence guarantees for gradient-type algorithms such as SGD with random reshuffling and shuffle-once mini-batch SGD. The derived rates match the best-known results for those algorithms, highlighting the tightness of our approach. Finally, our numerical evaluations support the theoretical findings and show the strong practical performance of our method.</abstract>

<originInfo><publisher>ML Research Press</publisher><dateIssued encoding="w3cdtf">2024</dateIssued><place><placeTerm type="text">Valencia, Spain</placeTerm></place>
</originInfo>
<language><languageTerm authority="iso639-2b" type="code">eng</languageTerm>
</language>



<relatedItem type="host"><titleInfo><title>Proceedings of The 27th International Conference on Artificial Intelligence and Statistics</title></titleInfo>
  <identifier type="eIssn">2640-3498</identifier>
  <identifier type="arXiv">2310.20452</identifier>
<part><detail type="volume"><number>238</number></detail><extent unit="pages">649-657</extent>
</part>
</relatedItem>


<extension>
<bibliographicCitation>
<apa>Islamov, R., Safaryan, M., &amp;#38; Alistarh, D.-A. (2024). AsGrad: A sharp unified analysis of asynchronous-SGD algorithms. In &lt;i&gt;Proceedings of The 27th International Conference on Artificial Intelligence and Statistics&lt;/i&gt; (Vol. 238, pp. 649–657). Valencia, Spain: ML Research Press.</apa>
<ama>Islamov R, Safaryan M, Alistarh D-A. AsGrad: A sharp unified analysis of asynchronous-SGD algorithms. In: &lt;i&gt;Proceedings of The 27th International Conference on Artificial Intelligence and Statistics&lt;/i&gt;. Vol 238. ML Research Press; 2024:649-657.</ama>
<short>R. Islamov, M. Safaryan, D.-A. Alistarh, in: Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, ML Research Press, 2024, pp. 649–657.</short>
<ista>Islamov R, Safaryan M, Alistarh D-A. 2024. AsGrad: A sharp unified analysis of asynchronous-SGD algorithms. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics. AISTATS: Conference on Artificial Intelligence and Statistics, PMLR, vol. 238, 649–657.</ista>
<ieee>R. Islamov, M. Safaryan, and D.-A. Alistarh, “AsGrad: A sharp unified analysis of asynchronous-SGD algorithms,” in &lt;i&gt;Proceedings of The 27th International Conference on Artificial Intelligence and Statistics&lt;/i&gt;, Valencia, Spain, 2024, vol. 238, pp. 649–657.</ieee>
<mla>Islamov, Rustem, et al. “AsGrad: A Sharp Unified Analysis of Asynchronous-SGD Algorithms.” &lt;i&gt;Proceedings of The 27th International Conference on Artificial Intelligence and Statistics&lt;/i&gt;, vol. 238, ML Research Press, 2024, pp. 649–57.</mla>
<chicago>Islamov, Rustem, Mher Safaryan, and Dan-Adrian Alistarh. “AsGrad: A Sharp Unified Analysis of Asynchronous-SGD Algorithms.” In &lt;i&gt;Proceedings of The 27th International Conference on Artificial Intelligence and Statistics&lt;/i&gt;, 238:649–57. ML Research Press, 2024.</chicago>
</bibliographicCitation>
</extension>
<recordInfo><recordIdentifier>18976</recordIdentifier><recordCreationDate encoding="w3cdtf">2025-01-30T08:15:49Z</recordCreationDate><recordChangeDate encoding="w3cdtf">2025-04-14T07:54:52Z</recordChangeDate>
</recordInfo>
</mods>
</modsCollection>
