Memorization and optimization in deep neural networks with minimum over-parameterization

Conference Paper | Published | English
Author
Bombari, Simone (ISTA); Amani, Mohammad Hossein; Mondelli, Marco (ISTA)

Corresponding author has ISTA affiliation

Series Title
Advances in Neural Information Processing Systems
Abstract
The Neural Tangent Kernel (NTK) has emerged as a powerful tool to provide memorization, optimization and generalization guarantees in deep neural networks. A line of work has studied the NTK spectrum for two-layer and deep networks with at least one layer of Ω(N) neurons, N being the number of training samples. Furthermore, there is increasing evidence suggesting that deep networks with sub-linear layer widths are powerful memorizers and optimizers, as long as the number of parameters exceeds the number of samples. Thus, a natural open question is whether the NTK is well conditioned in such a challenging sub-linear setup. In this paper, we answer this question in the affirmative. Our key technical contribution is a lower bound on the smallest NTK eigenvalue for deep networks with the minimum possible over-parameterization: the number of parameters is roughly Ω(N) and, hence, the number of neurons is as few as Ω(√N). To showcase the applicability of our NTK bounds, we provide two results concerning memorization capacity and optimization guarantees for gradient descent training.
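As an illustration of the quantity the paper bounds, the following minimal sketch (not the authors' code; the network shape, Gaussian data, and scaling choices are illustrative assumptions) computes the smallest eigenvalue of the empirical NTK Gram matrix K = J Jᵀ for a small two-layer ReLU network whose width scales like √N, so that the parameter count slightly exceeds N:

import numpy as np

rng = np.random.default_rng(0)

N, d0 = 64, 16                        # number of samples, input dimension
width = int(np.ceil(np.sqrt(N)))      # sub-linear width ~ sqrt(N)

X = rng.standard_normal((N, d0)) / np.sqrt(d0)
W1 = rng.standard_normal((d0, width)) / np.sqrt(d0)
W2 = rng.standard_normal((width, 1)) / np.sqrt(width)

def jacobian(X, W1, W2):
    """Per-sample Jacobian of the scalar output f(x) = relu(x W1) W2
    with respect to all weights (W1 and W2)."""
    H = X @ W1                        # pre-activations, shape (N, width)
    A = np.maximum(H, 0.0)            # ReLU activations
    J2 = A                            # d f / d W2, shape (N, width)
    G = (H > 0).astype(float) * W2.T  # relu'(h) * w2, shape (N, width)
    J1 = (X[:, :, None] * G[:, None, :]).reshape(N, -1)  # d f / d W1
    return np.concatenate([J1, J2], axis=1)  # (N, p), p = d0*width + width

J = jacobian(X, W1, W2)
K = J @ J.T                           # empirical NTK Gram matrix, (N, N)
print("lambda_min(NTK) =", np.linalg.eigvalsh(K)[0])

With these illustrative sizes the parameter count is p = 136 > N = 64, and the printed smallest eigenvalue is strictly positive, which is the well-conditioning phenomenon the paper establishes rigorously in the sub-linear-width regime.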
Publishing Year
2022
Date Published
2022-07-24
Proceedings Title
36th Conference on Neural Information Processing Systems
Publisher
Neural Information Processing Systems Foundation
Acknowledgement
The authors were partially supported by the 2019 Lopez-Loreta prize, and they would like to thank Quynh Nguyen, Mahdi Soltanolkotabi and Adel Javanmard for helpful discussions.
Volume
35
Page
7628-7640
Conference
NeurIPS: Neural Information Processing Systems
Conference Location
New Orleans, LA, United States
Conference Date
2022-11-28 – 2022-12-09

Cite this

Bombari S, Amani MH, Mondelli M. Memorization and optimization in deep neural networks with minimum over-parameterization. In: 36th Conference on Neural Information Processing Systems. Vol 35. Neural Information Processing Systems Foundation; 2022:7628-7640.
Bombari, S., Amani, M. H., & Mondelli, M. (2022). Memorization and optimization in deep neural networks with minimum over-parameterization. In 36th Conference on Neural Information Processing Systems (Vol. 35, pp. 7628–7640). New Orleans, LA, United States: Neural Information Processing Systems Foundation.
Bombari, Simone, Mohammad Hossein Amani, and Marco Mondelli. “Memorization and Optimization in Deep Neural Networks with Minimum Over-Parameterization.” In 36th Conference on Neural Information Processing Systems, 35:7628–40. Neural Information Processing Systems Foundation, 2022.
S. Bombari, M. H. Amani, and M. Mondelli, “Memorization and optimization in deep neural networks with minimum over-parameterization,” in 36th Conference on Neural Information Processing Systems, New Orleans, LA, United States, 2022, vol. 35, pp. 7628–7640.
Bombari S, Amani MH, Mondelli M. 2022. Memorization and optimization in deep neural networks with minimum over-parameterization. 36th Conference on Neural Information Processing Systems. NeurIPS: Neural Information Processing Systems, Advances in Neural Information Processing Systems, vol. 35, 7628–7640.
Bombari, Simone, et al. “Memorization and Optimization in Deep Neural Networks with Minimum Over-Parameterization.” 36th Conference on Neural Information Processing Systems, vol. 35, Neural Information Processing Systems Foundation, 2022, pp. 7628–40.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]

Link(s) to Main File(s)
Access Level
Open Access

Sources

arXiv 2205.10217
