TY - JOUR
AB - In the Minimum Description Length (MDL) principle, learning from the data is equivalent to an optimal coding problem. We show that the codes that achieve optimal compression in MDL are critical in a very precise sense. First, when they are taken as generative models of samples, they generate samples with broad empirical distributions and with a high value of the relevance, defined as the entropy of the empirical frequencies. These results are derived for different statistical models (Dirichlet model, independent and pairwise dependent spin models, and restricted Boltzmann machines). Second, MDL codes sit precisely at a second order phase transition point where the symmetry between the sampled outcomes is spontaneously broken. The order parameter controlling the phase transition is the coding cost of the samples. The phase transition is a manifestation of the optimality of MDL codes, and it arises because codes that achieve a higher compression do not exist. These results suggest a clear interpretation of the widespread occurrence of statistical criticality as a characterization of samples which are maximally informative on the underlying generative process.
AU - Cubero, Ryan J
AU - Marsili, Matteo
AU - Roudi, Yasser
ID - 7126
IS - 10
JF - Entropy
KW - Minimum Description Length
KW - normalized maximum likelihood
KW - statistical criticality
KW - phase transitions
KW - large deviations
SN - 1099-4300
TI - Minimum description length codes are critical
VL - 20
ER -