{"issue":"10","article_processing_charge":"No","author":[{"orcid":"0000-0003-0002-1867","id":"850B2E12-9CD4-11E9-837F-E719E6697425","first_name":"Ryan J","full_name":"Cubero, Ryan J","last_name":"Cubero"},{"first_name":"Matteo","full_name":"Marsili, Matteo","last_name":"Marsili"},{"full_name":"Roudi, Yasser","first_name":"Yasser","last_name":"Roudi"}],"intvolume":" 20","oa":1,"oa_version":"Published Version","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","doi":"10.3390/e20100755","date_updated":"2021-01-12T08:11:56Z","publication_identifier":{"issn":["1099-4300"]},"abstract":[{"lang":"eng","text":"In the Minimum Description Length (MDL) principle, learning from the data is equivalent to an optimal coding problem. We show that the codes that achieve optimal compression in MDL are critical in a very precise sense. First, when they are taken as generative models of samples, they generate samples with broad empirical distributions and with a high value of the relevance, defined as the entropy of the empirical frequencies. These results are derived for different statistical models (Dirichlet model, independent and pairwise dependent spin models, and restricted Boltzmann machines). Second, MDL codes sit precisely at a second order phase transition point where the symmetry between the sampled outcomes is spontaneously broken. The order parameter controlling the phase transition is the coding cost of the samples. The phase transition is a manifestation of the optimality of MDL codes, and it arises because codes that achieve a higher compression do not exist. These results suggest a clear interpretation of the widespread occurrence of statistical criticality as a characterization of samples which are maximally informative on the underlying generative process."}],
"date_published":"2018-10-01T00:00:00Z","_id":"7126","publication":"Entropy","status":"public","article_type":"original","volume":20,"day":"01","type":"journal_article","publisher":"MDPI","language":[{"iso":"eng"}],"month":"10","publication_status":"published","quality_controlled":"1","file":[{"file_size":1366813,"file_id":"7127","content_type":"application/pdf","date_created":"2019-11-26T22:23:08Z","access_level":"open_access","date_updated":"2020-07-14T12:47:50Z","relation":"main_file","checksum":"d642b7b661e1d5066b62e6ea9986b917","creator":"rcubero","file_name":"entropy-20-00755-v2.pdf"}],"ddc":["519"],"title":"Minimum description length codes are critical","article_number":"755","date_created":"2019-11-26T22:18:05Z","has_accepted_license":"1","year":"2018","extern":"1","citation":{"chicago":"Cubero, Ryan J, Matteo Marsili, and Yasser Roudi. “Minimum Description Length Codes Are Critical.” Entropy. MDPI, 2018. https://doi.org/10.3390/e20100755.","ista":"Cubero RJ, Marsili M, Roudi Y. 2018. Minimum description length codes are critical. Entropy. 20(10), 755.","apa":"Cubero, R. J., Marsili, M., & Roudi, Y. (2018). Minimum description length codes are critical. Entropy. MDPI. https://doi.org/10.3390/e20100755","ieee":"R. J. Cubero, M. Marsili, and Y. Roudi, “Minimum description length codes are critical,” Entropy, vol. 20, no. 10. MDPI, 2018.","mla":"Cubero, Ryan J., et al. “Minimum Description Length Codes Are Critical.” Entropy, vol. 20, no. 10, 755, MDPI, 2018, doi:10.3390/e20100755.","ama":"Cubero RJ, Marsili M, Roudi Y. Minimum description length codes are critical. Entropy. 2018;20(10). doi:10.3390/e20100755","short":"R.J. Cubero, M. Marsili, Y. Roudi, Entropy 20 (2018)."},
"keyword":["Minimum Description Length","normalized maximum likelihood","statistical criticality","phase transitions","large deviations"],"file_date_updated":"2020-07-14T12:47:50Z","tmp":{"legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","short":"CC BY (4.0)","image":"/images/cc_by.png","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)"}}