
Going, for example, with the notation used in (Renner 2006), the min- and max-entropies of a source $X$ with probability distribution $P_X$ are defined as $$H_{\rm max}(X) \equiv \log|\{x : \,\, P_X(x)>0\}| = \log|\operatorname{supp}(P_X)|, \\ H_{\rm min}(X) \equiv \min_x \log\left(\frac{1}{P_X(x)}\right) = -\log \max_x P_X(x).$$ I would guess that these definitions originally come from taking Rényi entropies in the limits $\alpha\to0$ and $\alpha\to\infty$. However, I wonder: is there any other reason why we would want to use this definition of $H_{\rm max}$, rather than $$\tilde H_{\rm max}(X) \equiv \max_x \log\left(\frac{1}{P_X(x)}\right) = -\log \min_x P_X(x)?$$ Such a definition is clearly more similar in spirit to $H_{\rm min}$, and arguably makes the name "max-entropy" a bit more intuitive. It also still satisfies $H_{\rm min}(X)\le H(X)\le \tilde H_{\rm max}(X)$.
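As a quick numerical illustration of these definitions (a minimal sketch in Python; the particular distribution is arbitrary), note how a single very small but nonzero probability inflates $\tilde H_{\rm max}$ while $H_{\rm max}$ only sees the size of the support:

```python
import numpy as np

# Toy distribution over four outcomes, one of which is very unlikely.
p = np.array([0.70, 0.20, 0.0999, 0.0001])

H_shannon   = -np.sum(p * np.log2(p))           # Shannon entropy H(X)
H_min       = -np.log2(np.max(p))               # min-entropy
H_max       =  np.log2(np.count_nonzero(p))     # max-entropy: log of the support size
H_max_tilde = -np.log2(np.min(p[p > 0]))        # the alternative definition above

print(H_min, H_shannon, H_max, H_max_tilde)
# ≈ 0.51 ≤ 1.16 ≤ 2.00 ≤ 13.29
```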

Is there a good reason to use $H_{\rm max}$ rather than $\tilde H_{\rm max}$ in the context of single-shot entropies? Would results still hold for this modified quantity, or is there any other obvious reason not to use this other definition?

glS
  • Your definition of $H_{\max}$ is easily made infinite, and I don't see a nice operational interpretation for it, whereas the standard definition at least operationally corresponds to the trivial upper bound on source compression. – Rammus Aug 19 '22 at 14:57
  • @Rammus good point about it possibly being infinite; that is easily fixed by taking $\displaystyle\max_{x:\, P_X(x)>0}$ instead. On the operational side, you're probably right; indeed, I'd say that's part of my question. But is the "trivial upper bound" an "operational interpretation"? Isn't $\tilde H_{\rm max}$ also a "trivial upper bound" for $H(X)$? – glS Aug 19 '22 at 15:05
  • Maybe "trivial" was a poor choice of words. If you take $\lceil H_{\max} \rceil$, then this is an optimal bound on the number of bits needed to compress your source with zero error in the single-copy (non-asymptotic) setting. Even with your new definition the quantity is unbounded, so I don't think it will correspond to anything operational (maybe it does, though). – Rammus Aug 19 '22 at 15:30
  • @Rammus thanks, that makes sense. Btw, do you know a good source showing why that interpretation holds for $H_{\rm max}$? It sounds similar to what Tomamichel discusses in sec 1.2 of https://arxiv.org/abs/1504.00233, but also not quite the same. Also, does $H_{\rm min}$ have a similar interpretation as a bound for single-shot compressibility? – glS Aug 19 '22 at 15:47
  • $2^{-H_{\rm min}(X)}$ is the maximal probability of correctly guessing the value of a random sample from $X$. – kodlu Aug 19 '22 at 15:58
  • In fact $\tilde{H}_{\text{max}}$ is a special case of the Rényi entropy ($\alpha \to -\infty$), but there isn't much about it in the literature. One reason is that the $\alpha$-norm appearing in the definition of the Rényi entropy is no longer a norm. I'll shamelessly refer you to https://arxiv.org/pdf/2202.03951.pdf, which explores a related quantity, Sibson's $\alpha$-mutual information, for the case $\alpha<0$. Unfortunately no operational meaning is given, but it is shown how one can leverage it in certain applications; hopefully it's useful! – adrien_vdb Aug 26 '22 at 09:08
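To make the operational statements in the comments above concrete, here is a minimal sketch (Python, with an arbitrary toy distribution and illustrative names) of the two interpretations mentioned: enumerating the support gives a zero-error single-shot code of $\lceil H_{\rm max}\rceil$ bits, and the best single guess succeeds with probability $\max_x P_X(x) = 2^{-H_{\rm min}(X)}$:

```python
import math

p = {"a": 0.70, "b": 0.20, "c": 0.0999, "d": 0.0001}

# Zero-error single-shot compression: give every outcome in the support a
# fixed-length binary index of ceil(H_max) bits.  Each outcome is encoded
# and decoded exactly, so the error probability is zero.
support = [x for x, px in p.items() if px > 0]
H_max = math.log2(len(support))
code_length = math.ceil(H_max)                 # here ceil(log2(4)) = 2 bits
codebook = {x: format(i, f"0{code_length}b") for i, x in enumerate(support)}
print(codebook)                                # {'a': '00', 'b': '01', 'c': '10', 'd': '11'}

# Guessing probability: the best strategy is to guess the most likely outcome,
# which succeeds with probability max_x P_X(x) = 2^(-H_min).
H_min = -math.log2(max(p.values()))
print(max(p.values()), 2 ** -H_min)            # both ≈ 0.70
```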

0 Answers