Neural collapse versus low-rank bias: Is deep neural collapse really optimal?

Súkeník P, Lampert C, Mondelli M. 2024. Neural collapse versus low-rank bias: Is deep neural collapse really optimal? 38th Annual Conference on Neural Information Processing Systems. NeurIPS: Neural Information Processing Systems, Advances in Neural Information Processing Systems, vol. 37.

Download
OA 2024_NeurIPS_Sukenik.pdf 1.78 MB [Published Version]
Conference Paper | Published | English

Corresponding author has ISTA affiliation

Series Title
Advances in Neural Information Processing Systems
Abstract
Deep neural networks (DNNs) exhibit a surprising structure in their final layer known as neural collapse (NC), and a growing body of work has recently investigated the propagation of neural collapse to earlier layers of DNNs – a phenomenon called deep neural collapse (DNC). However, existing theoretical results are restricted to special cases: linear models, only two layers, or binary classification. In contrast, we focus on non-linear models of arbitrary depth in multi-class classification and reveal a surprising qualitative shift. As soon as we go beyond two layers or two classes, DNC stops being optimal for the deep unconstrained features model (DUFM) – the standard theoretical framework for the analysis of collapse. The main culprit is a low-rank bias of multi-layer regularization schemes: this bias leads to optimal solutions of even lower rank than neural collapse. We support our theoretical findings with experiments on both DUFM and real data, which show the emergence of the low-rank structure in the solution found by gradient descent.
Publishing Year
2024
Date Published
2024-12-01
Proceedings Title
38th Annual Conference on Neural Information Processing Systems
Publisher
Neural Information Processing Systems Foundation
Acknowledgement
Marco Mondelli is partially supported by the 2019 Lopez-Loreta prize. This research was supported by the Scientific Service Units (SSU) of ISTA through resources provided by Scientific Computing (SciComp).
Acknowledged SSUs
Scientific Computing (SciComp)
Volume
37
Conference
NeurIPS: Neural Information Processing Systems
Conference Location
Vancouver, Canada
Conference Date
2024-12-16 – 2024-12-16
IST-REx-ID

Cite this

Súkeník P, Lampert C, Mondelli M. Neural collapse versus low-rank bias: Is deep neural collapse really optimal? In: 38th Annual Conference on Neural Information Processing Systems. Vol 37. Neural Information Processing Systems Foundation; 2024.
Súkeník, P., Lampert, C., & Mondelli, M. (2024). Neural collapse versus low-rank bias: Is deep neural collapse really optimal? In 38th Annual Conference on Neural Information Processing Systems (Vol. 37). Vancouver, Canada: Neural Information Processing Systems Foundation.
Súkeník, Peter, Christoph Lampert, and Marco Mondelli. “Neural Collapse versus Low-Rank Bias: Is Deep Neural Collapse Really Optimal?” In 38th Annual Conference on Neural Information Processing Systems, Vol. 37. Neural Information Processing Systems Foundation, 2024.
P. Súkeník, C. Lampert, and M. Mondelli, “Neural collapse versus low-rank bias: Is deep neural collapse really optimal?,” in 38th Annual Conference on Neural Information Processing Systems, Vancouver, Canada, 2024, vol. 37.
Súkeník, Peter, et al. “Neural Collapse versus Low-Rank Bias: Is Deep Neural Collapse Really Optimal?” 38th Annual Conference on Neural Information Processing Systems, vol. 37, Neural Information Processing Systems Foundation, 2024.
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
Main File(s)
File Name
Access Level
OA Open Access
Date Uploaded
2025-02-04
MD5 Checksum
b7b79f1ea3ac1e9e11b3d91faaeb0780

