7 Publications

[7]
2025 | Published | Conference Paper | IST-REx-ID: 20035 | OA
A. Jacot, P. Súkeník, Z. Wang, and M. Mondelli, “Wide neural networks trained with weight decay provably exhibit neural collapse,” in 13th International Conference on Learning Representations, Singapore, 2025, pp. 1905–1931.
 
[6]
2024 | Epub ahead of print | Journal Article | IST-REx-ID: 12662 | OA
P. Súkeník and C. Lampert, “Generalization in multi-objective machine learning,” Neural Computing and Applications. Springer Nature, 2024.
 
[5]
2024 | Published | Conference Paper | IST-REx-ID: 18890 | OA
D. Beaglehole, P. Súkeník, M. Mondelli, and M. Belkin, “Average gradient outer product as a mechanism for deep neural collapse,” in 38th Annual Conference on Neural Information Processing Systems, Vancouver, Canada, 2024, vol. 37.
 
[4]
2024 | Published | Conference Paper | IST-REx-ID: 18891 | OA
P. Súkeník, C. Lampert, and M. Mondelli, “Neural collapse versus low-rank bias: Is deep neural collapse really optimal?,” in 38th Annual Conference on Neural Information Processing Systems, Vancouver, Canada, 2024, vol. 37.
 
[3]
2023 | Published | Conference Paper | IST-REx-ID: 14921 | OA
P. Súkeník, M. Mondelli, and C. Lampert, “Deep neural collapse is provably optimal for the deep unconstrained features model,” in 37th Annual Conference on Neural Information Processing Systems, New Orleans, LA, United States, 2023.
 
[2]
2022 | Published | Conference Paper | IST-REx-ID: 12664 | OA
P. Súkeník, A. Kuvshinov, and S. Günnemann, “Intriguing properties of input-dependent randomized smoothing,” in Proceedings of the 39th International Conference on Machine Learning, Baltimore, MD, United States, 2022, vol. 162, pp. 20697–20743.
 
[1]
2022 | Published | Conference Paper | IST-REx-ID: 18876 | OA
P. Kocsis, P. Súkeník, G. Brasó, M. Niessner, L. Leal-Taixé, and I. Elezi, “The unreasonable effectiveness of fully-connected layers for low-data regimes,” in 36th Conference on Neural Information Processing Systems, New Orleans, LA, United States, 2022, vol. 35, pp. 1896–1908.
 

Citation Style: IEEE