{"year":"2025","article_number":"854","pmid":1,"DOAJ_listed":"1","date_created":"2025-09-07T22:01:33Z","date_updated":"2025-09-30T14:32:31Z","department":[{"_id":"HeEd"}],"doi":"10.3390/e27080854","ddc":["500"],"date_published":"2025-08-01T00:00:00Z","citation":{"ista":"Akopyan A, Edelsbrunner H, Virk Z, Wagner H. 2025. Tight bounds between the Jensen–Shannon divergence and the minmax divergence. Entropy. 27(8), 854.","short":"A. Akopyan, H. Edelsbrunner, Z. Virk, H. Wagner, Entropy 27 (2025).","mla":"Akopyan, Arseniy, et al. “Tight Bounds between the Jensen–Shannon Divergence and the Minmax Divergence.” Entropy, vol. 27, no. 8, 854, MDPI, 2025, doi:10.3390/e27080854.","ama":"Akopyan A, Edelsbrunner H, Virk Z, Wagner H. Tight bounds between the Jensen–Shannon divergence and the minmax divergence. Entropy. 2025;27(8). doi:10.3390/e27080854","apa":"Akopyan, A., Edelsbrunner, H., Virk, Z., & Wagner, H. (2025). Tight bounds between the Jensen–Shannon divergence and the minmax divergence. Entropy. MDPI. https://doi.org/10.3390/e27080854","chicago":"Akopyan, Arseniy, Herbert Edelsbrunner, Ziga Virk, and Hubert Wagner. “Tight Bounds between the Jensen–Shannon Divergence and the Minmax Divergence.” Entropy. MDPI, 2025. https://doi.org/10.3390/e27080854.","ieee":"A. Akopyan, H. Edelsbrunner, Z. Virk, and H. Wagner, “Tight bounds between the Jensen–Shannon divergence and the minmax divergence,” Entropy, vol. 27, no. 8. MDPI, 2025."},"publication_identifier":{"eissn":["1099-4300"]},"day":"01","project":[{"_id":"266A2E9E-B435-11E9-9278-68D0E5697425","call_identifier":"H2020","grant_number":"788183","name":"Alpha Shape Theory Extended"},{"call_identifier":"FWF","grant_number":"Z00342","name":"Mathematics, Computer Science","_id":"268116B8-B435-11E9-9278-68D0E5697425"},{"_id":"2561EBF4-B435-11E9-9278-68D0E5697425","name":"Persistence and stability of geometric complexes","grant_number":"I02979-N35","call_identifier":"FWF"}],"author":[{"orcid":"0000-0002-2548-617X","first_name":"Arseniy","id":"430D2C90-F248-11E8-B48F-1D18A9856A87","full_name":"Akopyan, Arseniy","last_name":"Akopyan"},{"last_name":"Edelsbrunner","full_name":"Edelsbrunner, Herbert","id":"3FB178DA-F248-11E8-B48F-1D18A9856A87","first_name":"Herbert","orcid":"0000-0002-9823-6833"},{"id":"2E36B656-F248-11E8-B48F-1D18A9856A87","first_name":"Ziga","last_name":"Virk","full_name":"Virk, Ziga"},{"full_name":"Wagner, Hubert","last_name":"Wagner","id":"379CA8B8-F248-11E8-B48F-1D18A9856A87","first_name":"Hubert"}],"type":"journal_article","article_processing_charge":"Yes","language":[{"iso":"eng"}],"quality_controlled":"1","status":"public","user_id":"317138e5-6ab7-11ef-aa6d-ffef3953e345","file_date_updated":"2025-09-08T07:55:48Z","external_id":{"pmid":["40870326"],"isi":["001557476000001"]},"title":"Tight bounds between the Jensen–Shannon divergence and the minmax divergence","OA_type":"gold","publication_status":"published","oa":1,"scopus_import":"1","article_type":"original","isi":1,"_id":"20293","acknowledgement":"This research received partial funding from the European Research Council (ERC) under\r\nthe European Union’s Horizon 2020 research and innovation programme, grant no. 788183, the\r\nWittgenstein Prize, Austrian Science Fund (FWF), grant no. Z 342-N31, the DFG Collaborative\r\nResearch Center TRR 109, ‘Discretization in Geometry and Dynamics’, Austrian Science Fund (FWF), grant no. I 02979-N35, and the 2022 Google Research Scholar Award for project ‘Algorithms for Topological Analysis of Neural Networks’. 
The APC was waived.","PlanS_conform":"1","month":"08","corr_author":"1","oa_version":"Published Version","OA_place":"publisher","has_accepted_license":"1","ec_funded":1,"volume":27,"publication":"Entropy","issue":"8","publisher":"MDPI","intvolume":" 27","abstract":[{"lang":"eng","text":"Motivated by questions arising at the intersection of information theory and geometry, we compare two dissimilarity measures between finite categorical distributions. One is the well-known Jensen–Shannon divergence, which is easy to compute and whose square root is a proper metric. The other is what we call the minmax divergence, which is harder to compute. Just like the Jensen–Shannon divergence, it arises naturally from the Kullback–Leibler divergence. The main contribution of this paper is a proof showing that the minmax divergence can be tightly approximated by the Jensen–Shannon divergence. The bounds suggest that the square root of the minmax divergence is a metric, and we prove that this is indeed true in the one-dimensional case. The general case remains open. Finally, we consider analogous questions in the context of another Bregman divergence and the corresponding Burbea–Rao (Jensen–Bregman) divergence."}],"file":[{"file_name":"2025_Entropy_Akopyan.pdf","creator":"dernst","checksum":"65c5399c4015d9c8abb8c7a96f3d7836","date_updated":"2025-09-08T07:55:48Z","access_level":"open_access","relation":"main_file","file_id":"20309","file_size":379340,"date_created":"2025-09-08T07:55:48Z","success":1,"content_type":"application/pdf"}],"tmp":{"short":"CC BY (4.0)","image":"/images/cc_by.png","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)","legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode"}}