Mean-field analysis for heavy ball methods: Dropout-stability, connectivity, and global convergence

Wu D, Kungurtsev V, Mondelli M. 2023. Mean-field analysis for heavy ball methods: Dropout-stability, connectivity, and global convergence. Transactions on Machine Learning Research, TMLR.

Conference Paper | Published | English
Author
Wu, Diyuan (ISTA); Kungurtsev, Vyacheslav; Mondelli, Marco (ISTA)

Corresponding author has ISTA affiliation

Department
Series Title
TMLR
Abstract
The stochastic heavy ball method (SHB), also known as stochastic gradient descent (SGD) with Polyak's momentum, is widely used in training neural networks. However, despite the remarkable success of this algorithm in practice, its theoretical characterization remains limited. In this paper, we focus on neural networks with two and three layers and provide a rigorous understanding of the properties of the solutions found by SHB: (i) stability after dropping out part of the neurons, (ii) connectivity along a low-loss path, and (iii) convergence to the global optimum. To achieve this goal, we take a mean-field view and relate the SHB dynamics to a certain partial differential equation in the limit of large network widths. This mean-field perspective has inspired a recent line of work focusing on SGD, while, in contrast, our paper considers an algorithm with momentum. More specifically, after proving existence and uniqueness of the limit differential equations, we show convergence to the global optimum and give a quantitative bound between the mean-field limit and the SHB dynamics of a finite-width network. Armed with this last bound, we are able to establish the dropout-stability and connectivity of SHB solutions.
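For context, SHB here refers to SGD with Polyak's momentum; a standard textbook form of the update (the paper's exact parameterization and time scaling may differ) is

$$v^{k+1} = \beta\, v^{k} - \eta\, \widehat{\nabla} L(\theta^{k}), \qquad \theta^{k+1} = \theta^{k} + v^{k+1},$$

where $\theta^{k}$ are the network parameters, $\widehat{\nabla} L(\theta^{k})$ is a stochastic estimate of the loss gradient, $\eta > 0$ is the step size, and $\beta \in [0,1)$ is the momentum parameter; setting $\beta = 0$ recovers plain SGD.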
Publishing Year
2023
Date Published
2023-02-28
Proceedings Title
Transactions on Machine Learning Research
Publisher
ML Research Press
Acknowledgement
D. Wu and M. Mondelli are partially supported by the 2019 Lopez-Loreta Prize. V. Kungurtsev was supported by the OP VVV project CZ.02.1.01/0.0/0.0/16_019/0000765 "Research Center for Informatics".
IST-REx-ID

Cite this

Wu D, Kungurtsev V, Mondelli M. Mean-field analysis for heavy ball methods: Dropout-stability, connectivity, and global convergence. In: Transactions on Machine Learning Research. ML Research Press; 2023.
Wu, D., Kungurtsev, V., & Mondelli, M. (2023). Mean-field analysis for heavy ball methods: Dropout-stability, connectivity, and global convergence. In Transactions on Machine Learning Research. ML Research Press.
Wu, Diyuan, Vyacheslav Kungurtsev, and Marco Mondelli. “Mean-Field Analysis for Heavy Ball Methods: Dropout-Stability, Connectivity, and Global Convergence.” In Transactions on Machine Learning Research. ML Research Press, 2023.
D. Wu, V. Kungurtsev, and M. Mondelli, “Mean-field analysis for heavy ball methods: Dropout-stability, connectivity, and global convergence,” in Transactions on Machine Learning Research, 2023.
Wu D, Kungurtsev V, Mondelli M. 2023. Mean-field analysis for heavy ball methods: Dropout-stability, connectivity, and global convergence. Transactions on Machine Learning Research, TMLR.
Wu, Diyuan, et al. “Mean-Field Analysis for Heavy Ball Methods: Dropout-Stability, Connectivity, and Global Convergence.” Transactions on Machine Learning Research, ML Research Press, 2023.
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)

Link(s) to Main File(s)
Access Level
Open Access

Sources

arXiv 2210.06819
