---
res:
bibo_abstract:
- We revisit the problem of estimating the entropy of discrete distributions from
  independent samples, recently studied by Acharya, Orlitsky, Suresh and Tyagi (SODA
  2015), improving their upper and lower bounds on the necessary sample size n. For
  estimating the Rényi entropy of order alpha, up to constant accuracy and error
  probability, we show the following. * Upper bounds n = O(1) * 2^{(1-1/alpha)H_alpha}
  for integer alpha > 1, as the worst case over distributions with Rényi entropy
  equal to H_alpha. * Lower bounds n = Omega(1) * K^{1-1/alpha} for any real alpha
  > 1, with the constant being an inverse polynomial of the accuracy, as the worst
  case over all distributions on K elements. Our upper bounds essentially replace
  the alphabet size by a factor exponential in the entropy, which offers improvements
  especially in low- or medium-entropy regimes (interesting, for example, in anomaly
  detection). As for the lower bounds, our proof explicitly shows how the complexity
  depends on both the alphabet size and the accuracy, partially solving the open
  problem posed in previous works. The argument for the upper bounds derives a clean
  identity for the variance of the falling-power sum of a multinomial distribution.
  Our approach for the lower bounds utilizes convex optimization to find a distribution
  with possibly worse estimation performance, and may be of independent interest
  as a tool for working with Le Cam's two-point method.
  @eng
bibo_authorlist:
- foaf_Person:
foaf_givenName: Maciej
foaf_name: Obremski, Maciej
foaf_surname: Obremski
- foaf_Person:
foaf_givenName: Maciej
foaf_name: Skórski, Maciej
foaf_surname: Skórski
foaf_workInfoHomepage: http://www.librecat.org/personId=EC09FA6A-02D0-11E9-8223-86B7C91467DD
bibo_doi: 10.4230/LIPIcs.APPROX-RANDOM.2017.20
bibo_volume: 81
dct_date: 2017^xs_gYear
dct_isPartOf:
- http://id.crossref.org/issn/18688969
dct_language: eng
dct_publisher: Schloss Dagstuhl - Leibniz-Zentrum für Informatik@
dct_title: Rényi entropy estimation revisited@
...