Neural collapse beyond the unconstrained features model: Landscape, dynamics, and generalization in the mean-field regime
Wu D, Mondelli M. 2025. Neural collapse beyond the unconstrained features model: Landscape, dynamics, and generalization in the mean-field regime. Proceedings of the 42nd International Conference on Machine Learning. ICML: International Conference on Machine Learning, PMLR, vol. 267, 67499–67536.
Conference Paper | Published | English
Author
Corresponding author has ISTA affiliation
Series Title
PMLR
Abstract
Neural Collapse is a phenomenon where the last-layer representations of a well-trained neural network converge to a highly structured geometry. In this paper, we focus on its first (and most basic) property, known as NC1: the within-class variability vanishes. While prior theoretical studies establish the occurrence of NC1 via the data-agnostic unconstrained features model, our work adopts a data-specific perspective, analyzing NC1 in a three-layer neural network, with the first two layers operating in the mean-field regime and followed by a linear layer. In particular, we establish a fundamental connection between NC1 and the loss landscape: we prove that points with small empirical loss and gradient norm (thus, close to being stationary) approximately satisfy NC1, and the closeness to NC1 is controlled by the residual loss and gradient norm. We then show that (i) gradient flow on the mean squared error converges to NC1 solutions with small empirical loss, and (ii) for well-separated data distributions, both NC1 and vanishing test loss are achieved simultaneously. This aligns with the empirical observation that NC1 emerges during training while models attain near-zero test error. Overall, our results demonstrate that NC1 arises from gradient training due to the properties of the loss landscape, and they show the co-occurrence of NC1 and small test error for certain data distributions.
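The NC1 property described in the abstract — within-class variability vanishing — is commonly quantified by comparing the within-class scatter of last-layer features to the between-class scatter. As an illustration only (the paper's formal NC1 definition may differ), a minimal sketch of such a metric:

```python
import numpy as np

def nc1_metric(features, labels):
    """Ratio of within-class to between-class feature variability.

    NC1 holds when this ratio approaches zero: the last-layer features
    of each class collapse onto their class mean. Illustrative metric
    only; not necessarily the exact quantity used in the paper.
    """
    classes = np.unique(labels)
    global_mean = features.mean(axis=0)
    d = features.shape[1]
    sw = np.zeros((d, d))   # within-class scatter
    sb = np.zeros((d, d))   # between-class scatter
    n = len(features)
    for c in classes:
        x = features[labels == c]
        mu = x.mean(axis=0)
        diff = x - mu
        sw += diff.T @ diff / n
        m = (mu - global_mean)[:, None]
        sb += (len(x) / n) * (m @ m.T)
    return np.trace(sw) / np.trace(sb)

# Fully collapsed features: every sample sits exactly at its class mean,
# so the within-class scatter (and hence the metric) is zero.
feats = np.array([[1.0, 0.0]] * 5 + [[0.0, 1.0]] * 5)
labs = np.array([0] * 5 + [1] * 5)
print(nc1_metric(feats, labs))  # 0.0
```

A network exhibiting approximate NC1, as in the paper's results, would drive this ratio toward zero as the empirical loss and gradient norm shrink.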
Publishing Year
Date Published
2025-07-30
Proceedings Title
Proceedings of the 42nd International Conference on Machine Learning
Publisher
ML Research Press
Acknowledgement
This research was funded in whole or in part by the Austrian Science Fund (FWF) 10.55776/COE12. For the purpose of open access, the authors have applied a CC BY public copyright license to any Author Accepted Manuscript version arising from this submission. The authors would like to thank Peter Súkeník for general helpful discussions and for pointing out that all the stationary points are approximately proportional in the case without entropic regularization.
Volume
267
Page
67499–67536
Conference
ICML: International Conference on Machine Learning
Conference Location
Vancouver, Canada
Conference Date
2025-07-13 – 2025-07-19
Cite this
Wu D, Mondelli M. Neural collapse beyond the unconstrained features model: Landscape, dynamics, and generalization in the mean-field regime. In: Proceedings of the 42nd International Conference on Machine Learning. Vol 267. ML Research Press; 2025:67499-67536.
Wu, D., & Mondelli, M. (2025). Neural collapse beyond the unconstrained features model: Landscape, dynamics, and generalization in the mean-field regime. In Proceedings of the 42nd International Conference on Machine Learning (Vol. 267, pp. 67499–67536). Vancouver, Canada: ML Research Press.
Wu, Diyuan, and Marco Mondelli. “Neural Collapse beyond the Unconstrained Features Model: Landscape, Dynamics, and Generalization in the Mean-Field Regime.” In Proceedings of the 42nd International Conference on Machine Learning, 267:67499–536. ML Research Press, 2025.
D. Wu and M. Mondelli, “Neural collapse beyond the unconstrained features model: Landscape, dynamics, and generalization in the mean-field regime,” in Proceedings of the 42nd International Conference on Machine Learning, Vancouver, Canada, 2025, vol. 267, pp. 67499–67536.
Wu, Diyuan, and Marco Mondelli. “Neural Collapse beyond the Unconstrained Features Model: Landscape, Dynamics, and Generalization in the Mean-Field Regime.” Proceedings of the 42nd International Conference on Machine Learning, vol. 267, ML Research Press, 2025, pp. 67499–536.
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC-BY 4.0):
Main File(s)
File Name
2025_ICML_Wu.pdf
3.99 MB
Access Level
Open Access
Date Uploaded
2026-02-19
MD5 Checksum
c5ce8b1c83e33dc3a11122f4910deb67
Sources
arXiv 2501.19104
