On the connection between learning two-layers neural networks and tensor decomposition

Mondelli M, Montanari A. 2019. On the connection between learning two-layers neural networks and tensor decomposition. Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics. AISTATS: Artificial Intelligence and Statistics vol. 89, 1051–1060.

Conference Paper | Published | English
Author
Mondelli, Marco (ISTA); Montanari, Andrea
Abstract
We establish connections between the problem of learning a two-layer neural network and tensor decomposition. We consider a model with feature vectors x ∈ ℝ^d, r hidden units with weights {w_i}_{1 ≤ i ≤ r} and output y ∈ ℝ, i.e., y = ∑_{i=1}^{r} σ(w_i^𝖳 x), with activation functions given by low-degree polynomials. In particular, if σ(x) = a_0 + a_1 x + a_3 x^3, we prove that no polynomial-time learning algorithm can outperform the trivial predictor that assigns to each example the response variable 𝔼(y), when d^{3/2} ≪ r ≪ d^2. Our conclusion holds for a 'natural data distribution', namely standard Gaussian feature vectors x, and output distributed according to a two-layer neural network with random isotropic weights, and under a certain complexity-theoretic assumption on tensor decomposition. Roughly speaking, we assume that no polynomial-time algorithm can substantially outperform current methods for tensor decomposition based on the sum-of-squares hierarchy. We also prove generalizations of this statement for higher degree polynomial activations, and non-random weight vectors. Remarkably, several existing algorithms for learning two-layer networks with rigorous guarantees are based on tensor decomposition. Our results support the idea that this is indeed the core computational difficulty in learning such networks, under the stated generative model for the data. As a side result, we show that under this model learning the network requires accurate learning of its weights, a property that does not hold in a more general setting.
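For readers who want to see the generative model concretely, below is a minimal Python sketch (not code from the paper) of the data distribution described in the abstract: standard Gaussian features x ∈ ℝ^d, random isotropic weights, a degree-3 polynomial activation σ(z) = a_0 + a_1 z + a_3 z^3, and the trivial predictor 𝔼(y). The dimensions, sample size, coefficient values, and the choice to draw weights uniformly from the unit sphere are illustrative assumptions.

```python
# Minimal sketch of the generative model from the abstract (illustrative, not the authors' code).
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: with d = 50, the hard regime d^{3/2} << r << d^2 roughly
# corresponds to 354 << r << 2500, so r = 500 falls inside it.
d, r, n = 50, 500, 10_000
a0, a1, a3 = 0.0, 1.0, 0.5          # coefficients of the degree-3 polynomial activation (assumed values)

def sigma(z):
    """Degree-3 polynomial activation sigma(z) = a0 + a1*z + a3*z^3."""
    return a0 + a1 * z + a3 * z**3

# Random isotropic weights: here, rows w_i drawn uniformly from the unit sphere in R^d (an assumption).
W = rng.standard_normal((r, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)

# Standard Gaussian feature vectors and network outputs y = sum_i sigma(w_i^T x).
X = rng.standard_normal((n, d))
y = sigma(X @ W.T).sum(axis=1)

# The "trivial predictor" assigns E(y) to every example; its squared-error risk is Var(y).
trivial_prediction = y.mean()
print("estimate of E(y):", trivial_prediction)
print("risk of the trivial predictor (Var y):", y.var())
```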
Publishing Year
2019
Date Published
2019-04-01
Proceedings Title
Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics
Publisher
Proceedings of Machine Learning Research
Volume
89
Page
1051-1060
Conference
AISTATS: Artificial Intelligence and Statistics
Conference Location
Naha, Okinawa, Japan
Conference Date
2019-04-16 – 2019-04-18
IST-REx-ID

Cite this

Mondelli M, Montanari A. On the connection between learning two-layers neural networks and tensor decomposition. In: Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics. Vol 89. Proceedings of Machine Learning Research; 2019:1051-1060.
Mondelli, M., & Montanari, A. (2019). On the connection between learning two-layers neural networks and tensor decomposition. In Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (Vol. 89, pp. 1051–1060). Naha, Okinawa, Japan: Proceedings of Machine Learning Research.
Mondelli, Marco, and Andrea Montanari. “On the Connection between Learning Two-Layers Neural Networks and Tensor Decomposition.” In Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics, 89:1051–60. Proceedings of Machine Learning Research, 2019.
M. Mondelli and A. Montanari, “On the connection between learning two-layers neural networks and tensor decomposition,” in Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics, Naha, Okinawa, Japan, 2019, vol. 89, pp. 1051–1060.
Mondelli M, Montanari A. 2019. On the connection between learning two-layers neural networks and tensor decomposition. Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics. AISTATS: Artificial Intelligence and Statistics vol. 89, 1051–1060.
Mondelli, Marco, and Andrea Montanari. “On the Connection between Learning Two-Layers Neural Networks and Tensor Decomposition.” Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics, vol. 89, Proceedings of Machine Learning Research, 2019, pp. 1051–60.

Link(s) to Main File(s)
Access Level
Open Access


Sources

arXiv 1802.07301
