Peter Súkeník
7 Publications
2025 | Published | Conference Paper | IST-REx-ID: 20035
Jacot A, Súkeník P, Wang Z, Mondelli M. Wide neural networks trained with weight decay provably exhibit neural collapse. In: 13th International Conference on Learning Representations. ICLR; 2025:1905-1931.
[Published Version] | Files available | arXiv
2024 | Epub ahead of print | Journal Article | IST-REx-ID: 12662
Súkeník P, Lampert C. Generalization in multi-objective machine learning. Neural Computing and Applications. 2024. doi:10.1007/s00521-024-10616-1
[Published Version] | DOI | arXiv
2024 | Published | Conference Paper | IST-REx-ID: 18890
Beaglehole D, Súkeník P, Mondelli M, Belkin M. Average gradient outer product as a mechanism for deep neural collapse. In: 38th Annual Conference on Neural Information Processing Systems. Vol 37. Neural Information Processing Systems Foundation; 2024.
[Preprint] | arXiv
2024 | Published | Conference Paper | IST-REx-ID: 18891
Súkeník P, Lampert C, Mondelli M. Neural collapse versus low-rank bias: Is deep neural collapse really optimal? In: 38th Annual Conference on Neural Information Processing Systems. Vol 37. Neural Information Processing Systems Foundation; 2024.
[Published Version] | Files available | arXiv
2023 | Published | Conference Paper | IST-REx-ID: 14921
Súkeník P, Mondelli M, Lampert C. Deep neural collapse is provably optimal for the deep unconstrained features model. In: 37th Annual Conference on Neural Information Processing Systems; 2023.
[Preprint] | arXiv
2022 | Published | Conference Paper | IST-REx-ID: 12664
Súkeník P, Kuvshinov A, Günnemann S. Intriguing properties of input-dependent randomized smoothing. In: Proceedings of the 39th International Conference on Machine Learning. Vol 162. ML Research Press; 2022:20697-20743.
[Published Version] | Files available | arXiv
2022 | Published | Conference Paper | IST-REx-ID: 18876
Kocsis P, Súkeník P, Brasó G, Niessner M, Leal-Taixé L, Elezi I. The unreasonable effectiveness of fully-connected layers for low-data regimes. In: 36th Conference on Neural Information Processing Systems. Vol 35. Neural Information Processing Systems Foundation; 2022:1896-1908.
[Published Version] | Files available | arXiv