Peter Súkeník
7 Publications
2025 | Published | Conference Paper | IST-REx-ID: 20035
Jacot, Arthur, Peter Súkeník, Zihan Wang, and Marco Mondelli. “Wide Neural Networks Trained with Weight Decay Provably Exhibit Neural Collapse.” In 13th International Conference on Learning Representations, 1905–31. ICLR, 2025.
[Published Version] | Files available | arXiv
2024 | Epub ahead of print | Journal Article | IST-REx-ID: 12662
Súkeník, Peter, and Christoph Lampert. “Generalization in Multi-Objective Machine Learning.” Neural Computing and Applications. Springer Nature, 2024. https://doi.org/10.1007/s00521-024-10616-1.
[Published Version] | DOI | arXiv
2024 | Published | Conference Paper | IST-REx-ID: 18890
Beaglehole, Daniel, Peter Súkeník, Marco Mondelli, and Mikhail Belkin. “Average Gradient Outer Product as a Mechanism for Deep Neural Collapse.” In 38th Annual Conference on Neural Information Processing Systems, Vol. 37. Neural Information Processing Systems Foundation, 2024.
[Preprint] | arXiv
2024 | Published | Conference Paper | IST-REx-ID: 18891
Súkeník, Peter, Christoph Lampert, and Marco Mondelli. “Neural Collapse versus Low-Rank Bias: Is Deep Neural Collapse Really Optimal?” In 38th Annual Conference on Neural Information Processing Systems, Vol. 37. Neural Information Processing Systems Foundation, 2024.
[Published Version] | Files available | arXiv
2023 | Published | Conference Paper | IST-REx-ID: 14921
Súkeník, Peter, Marco Mondelli, and Christoph Lampert. “Deep Neural Collapse Is Provably Optimal for the Deep Unconstrained Features Model.” In 37th Annual Conference on Neural Information Processing Systems, 2023.
[Preprint] | arXiv
2022 | Published | Conference Paper | IST-REx-ID: 12664
Súkeník, Peter, Aleksei Kuvshinov, and Stephan Günnemann. “Intriguing Properties of Input-Dependent Randomized Smoothing.” In Proceedings of the 39th International Conference on Machine Learning, 162:20697–743. ML Research Press, 2022.
[Published Version] | Files available | arXiv
2022 | Published | Conference Paper | IST-REx-ID: 18876
Kocsis, Peter, Peter Súkeník, Guillem Brasó, Matthias Niessner, Laura Leal-Taixé, and Ismail Elezi. “The Unreasonable Effectiveness of Fully-Connected Layers for Low-Data Regimes.” In 36th Conference on Neural Information Processing Systems, 35:1896–1908. Neural Information Processing Systems Foundation, 2022.
[Published Version] | Files available | arXiv