Tight bounds on the smallest eigenvalue of the neural tangent kernel for deep ReLU networks

Nguyen Q, Mondelli M, Montufar GF. 2021. Tight bounds on the smallest eigenvalue of the neural tangent kernel for deep ReLU networks. Proceedings of the 38th International Conference on Machine Learning. ICML: International Conference on Machine Learning, Proceedings of Machine Learning Research, vol. 139, 8119–8129.

Conference Paper | Published | English
Author
Nguyen, Quynh; Mondelli, Marco (ISTA); Montufar, Guido F.
Editor
Meila, Marina; Zhang, Tong
Series Title
Proceedings of Machine Learning Research
Abstract
A recent line of work has analyzed the theoretical properties of deep neural networks via the Neural Tangent Kernel (NTK). In particular, the smallest eigenvalue of the NTK has been related to the memorization capacity, the global convergence of gradient descent algorithms and the generalization of deep nets. However, existing results either provide bounds in the two-layer setting or assume that the spectrum of the NTK matrices is bounded away from 0 for multi-layer networks. In this paper, we provide tight bounds on the smallest eigenvalue of NTK matrices for deep ReLU nets, both in the limiting case of infinite widths and for finite widths. In the finite-width setting, the network architectures we consider are fairly general: we require the existence of a wide layer with roughly order of $N$ neurons, $N$ being the number of data samples; and the scaling of the remaining layer widths is arbitrary (up to logarithmic factors). To obtain our results, we analyze various quantities of independent interest: we give lower bounds on the smallest singular value of hidden feature matrices, and upper bounds on the Lipschitz constant of input-output feature maps.
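The central object of the abstract is the NTK Gram matrix and its smallest eigenvalue. The following is a minimal illustrative sketch (not the paper's construction): it builds the empirical finite-width NTK of a toy two-layer ReLU network, $K = JJ^\top$ with $J$ the Jacobian of the network outputs with respect to all parameters, and computes its smallest eigenvalue. The network, dimensions, and initialization below are assumptions chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative only): two-layer ReLU net
# f(x) = a^T relu(W x) / sqrt(m), with N unit-norm inputs in R^d.
N, d, m = 20, 10, 500
X = rng.standard_normal((N, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # unit-norm data points

W = rng.standard_normal((m, d))  # hidden-layer weights at initialization
a = rng.standard_normal(m)       # output-layer weights at initialization

def jacobian_row(x):
    """Gradient of f(x) with respect to all parameters (W flattened, then a)."""
    pre = W @ x                       # pre-activations, shape (m,)
    act = np.maximum(pre, 0.0)        # ReLU features
    dact = (pre > 0).astype(float)    # ReLU derivative (a.e.)
    grad_W = np.outer(a * dact, x) / np.sqrt(m)  # shape (m, d)
    grad_a = act / np.sqrt(m)                    # shape (m,)
    return np.concatenate([grad_W.ravel(), grad_a])

J = np.stack([jacobian_row(x) for x in X])  # Jacobian, shape (N, m*d + m)
K = J @ J.T                                 # empirical NTK Gram matrix, N x N
lam_min = np.linalg.eigvalsh(K)[0]          # smallest eigenvalue
print(f"smallest NTK eigenvalue: {lam_min:.4f}")
```

With a wide layer ($m \gg N$, matching the paper's "roughly order of $N$ neurons" regime) the matrix $K$ is generically positive definite, which is the kind of statement the paper makes quantitative.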
Date Published
2021-01-01
Proceedings Title
Proceedings of the 38th International Conference on Machine Learning
Acknowledgement
The authors would like to thank the anonymous reviewers for their helpful comments. MM was partially supported by the 2019 Lopez-Loreta Prize. QN and GM acknowledge support from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement no 757983).
Volume
139
Page
8119-8129
Conference
ICML: International Conference on Machine Learning
Conference Location
Virtual
Conference Date
2021-07-18 – 2021-07-24
Cite this

Nguyen Q, Mondelli M, Montufar GF. Tight bounds on the smallest eigenvalue of the neural tangent kernel for deep ReLU networks. In: Meila M, Zhang T, eds. Proceedings of the 38th International Conference on Machine Learning. Vol 139. ML Research Press; 2021:8119-8129.
Nguyen, Q., Mondelli, M., & Montufar, G. F. (2021). Tight bounds on the smallest eigenvalue of the neural tangent kernel for deep ReLU networks. In M. Meila & T. Zhang (Eds.), Proceedings of the 38th International Conference on Machine Learning (Vol. 139, pp. 8119–8129). Virtual: ML Research Press.
Nguyen, Quynh, Marco Mondelli, and Guido F Montufar. “Tight Bounds on the Smallest Eigenvalue of the Neural Tangent Kernel for Deep ReLU Networks.” In Proceedings of the 38th International Conference on Machine Learning, edited by Marina Meila and Tong Zhang, 139:8119–29. ML Research Press, 2021.
Q. Nguyen, M. Mondelli, and G. F. Montufar, “Tight bounds on the smallest eigenvalue of the neural tangent kernel for deep ReLU networks,” in Proceedings of the 38th International Conference on Machine Learning, Virtual, 2021, vol. 139, pp. 8119–8129.
Nguyen, Quynh, et al. “Tight Bounds on the Smallest Eigenvalue of the Neural Tangent Kernel for Deep ReLU Networks.” Proceedings of the 38th International Conference on Machine Learning, edited by Marina Meila and Tong Zhang, vol. 139, ML Research Press, 2021, pp. 8119–29.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]

Link(s) to Main File(s)
Access Level
OA Open Access

Sources

arXiv 2012.11654
