Tight bounds between the Jensen–Shannon divergence and the minmax divergence

Akopyan A, Edelsbrunner H, Virk Z, Wagner H. 2025. Tight bounds between the Jensen–Shannon divergence and the minmax divergence. Entropy. 27(8), 854.

OA 2025_Entropy_Akopyan.pdf 379.34 KB [Published Version]

Journal Article | Published | English

Scopus indexed

Corresponding author has ISTA affiliation

Abstract
Motivated by questions arising at the intersection of information theory and geometry, we compare two dissimilarity measures between finite categorical distributions. One is the well-known Jensen–Shannon divergence, which is easy to compute and whose square root is a proper metric. The other is what we call the minmax divergence, which is harder to compute. Just like the Jensen–Shannon divergence, it arises naturally from the Kullback–Leibler divergence. The main contribution of this paper is a proof showing that the minmax divergence can be tightly approximated by the Jensen–Shannon divergence. The bounds suggest that the square root of the minmax divergence is a metric, and we prove that this is indeed true in the one-dimensional case. The general case remains open. Finally, we consider analogous questions in the context of another Bregman divergence and the corresponding Burbea–Rao (Jensen–Bregman) divergence.
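As the abstract notes, the Jensen–Shannon divergence arises naturally from the Kullback–Leibler divergence: it is the average KL divergence of the two distributions to their midpoint. A minimal sketch for finite categorical distributions (function names and the base-2 logarithm are illustrative choices, not taken from the paper; the minmax divergence itself is defined in the paper and not reproduced here):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence between finite categorical distributions.
    Terms with p[i] == 0 contribute 0 by the usual convention."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js(p, q):
    """Jensen-Shannon divergence: average KL divergence to the midpoint m."""
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
print(js(p, q))  # → 0.5
```

The square root of this quantity is a proper metric, which is the property the paper's bounds suggest also holds for the minmax divergence.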
Publishing Year
2025
Date Published
2025-08-01
Journal Title
Entropy
Publisher
MDPI
Acknowledgement
This research received partial funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme, grant no. 788183, the Wittgenstein Prize, Austrian Science Fund (FWF), grant no. Z 342-N31, the DFG Collaborative Research Center TRR 109, ‘Discretization in Geometry and Dynamics’, Austrian Science Fund (FWF), grant no. I 02979-N35, and the 2022 Google Research Scholar Award for project ‘Algorithms for Topological Analysis of Neural Networks’. The APC was waived.
Volume
27
Issue
8
Article Number
854

Cite this

Akopyan A, Edelsbrunner H, Virk Z, Wagner H. Tight bounds between the Jensen–Shannon divergence and the minmax divergence. Entropy. 2025;27(8):854. doi:10.3390/e27080854
Akopyan, A., Edelsbrunner, H., Virk, Z., & Wagner, H. (2025). Tight bounds between the Jensen–Shannon divergence and the minmax divergence. Entropy. MDPI. https://doi.org/10.3390/e27080854
Akopyan, Arseniy, Herbert Edelsbrunner, Ziga Virk, and Hubert Wagner. “Tight Bounds between the Jensen–Shannon Divergence and the Minmax Divergence.” Entropy. MDPI, 2025. https://doi.org/10.3390/e27080854.
A. Akopyan, H. Edelsbrunner, Z. Virk, and H. Wagner, “Tight bounds between the Jensen–Shannon divergence and the minmax divergence,” Entropy, vol. 27, no. 8, Art. no. 854, 2025, doi: 10.3390/e27080854.
Akopyan A, Edelsbrunner H, Virk Z, Wagner H. 2025. Tight bounds between the Jensen–Shannon divergence and the minmax divergence. Entropy. 27(8), 854.
Akopyan, Arseniy, et al. “Tight Bounds between the Jensen–Shannon Divergence and the Minmax Divergence.” Entropy, vol. 27, no. 8, 854, MDPI, 2025, doi:10.3390/e27080854.
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC-BY 4.0):
Main File(s)
File Name
2025_Entropy_Akopyan.pdf
Access Level
OA Open Access
Date Uploaded
2025-09-08
MD5 Checksum
65c5399c4015d9c8abb8c7a96f3d7836


Open Data ISTA Research Explorer

Web of Science

Sources

PMID: 40870326
PubMed | Europe PMC
