Underspecification in deep learning

Phuong M. 2021. Underspecification in deep learning. IST Austria.

Main file: mph-thesis-v519-pdfimages.pdf (2.67 MB, Open Access)

Thesis | PhD | Published | English
Series: IST Austria Thesis
Abstract

Deep learning is best known for its empirical success across a wide range of applications spanning computer vision, natural language processing and speech. Of equal significance, though perhaps less known, are its ramifications for learning theory: deep networks have been observed to perform surprisingly well in the high-capacity regime, also known as the overfitting or underspecified regime. Classically, this regime on the far right of the bias-variance curve is associated with poor generalisation; however, recent experiments with deep networks challenge this view. This thesis is devoted to investigating various aspects of underspecification in deep learning. First, we argue that deep learning models are underspecified on two levels: a) any given training dataset can be fit by many different functions, and b) any given function can be expressed by many different parameter configurations. We refer to the second kind of underspecification as parameterisation redundancy and we precisely characterise its extent. Second, we characterise the implicit criteria (the inductive bias) that guide learning in the underspecified regime. Specifically, we consider a nonlinear but tractable classification setting, and show that given the choice, neural networks learn classifiers with a large margin. Third, we consider learning scenarios where the inductive bias is not by itself sufficient to deal with underspecification. We then study different ways of ‘tightening the specification’: i) In the setting of representation learning with variational autoencoders, we propose a hand-crafted regulariser based on mutual information. ii) In the setting of binary classification, we consider soft-label (real-valued) supervision. We derive a generalisation bound for linear networks supervised in this way and verify that soft labels facilitate fast learning. Finally, we explore an application of soft-label supervision to the training of multi-exit models.
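To illustrate the soft-label (real-valued) supervision mentioned in the abstract, here is a minimal hypothetical sketch — not code from the thesis — of a linear classifier trained against real-valued targets in [0, 1] rather than hard 0/1 labels. The data, learning rate, and epoch count are made-up assumptions for demonstration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_soft_labels(xs, soft_ys, lr=0.5, epochs=200):
    """Logistic regression fit to real-valued targets in [0, 1] via SGD.

    With soft labels, the cross-entropy gradient w.r.t. the logit is
    still (prediction - target), so the update rule is unchanged; the
    targets simply carry graded confidence instead of a hard class.
    """
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, soft_ys):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            g = p - y  # gradient of cross-entropy w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

# Illustrative 1-D data: soft labels encode confidence, not just class.
xs = [[0.0], [1.0], [2.0], [3.0]]
soft_ys = [0.1, 0.3, 0.7, 0.9]  # real-valued targets instead of {0, 1}
w, b = train_soft_labels(xs, soft_ys)
```

The learned weight is positive and the decision boundary falls between the low- and high-confidence inputs, matching the graded targets.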
Publishing Year: 2021

Cite this

Phuong, Mary. “Underspecification in Deep Learning.” PhD thesis, IST Austria, 2021. https://doi.org/10.15479/AT:ISTA:9418.
All files available under the following license(s):
Copyright Statement: This Item is protected by copyright and/or related rights. [...]

Source file: thesis.zip (93.00 MB, Closed Access)

