Questions tagged [renyi-entropy]

For questions related to Rényi entropy. The Rényi entropy of order $\alpha$, where $\alpha\geq0$ and $\alpha\neq1$, is defined as $\displaystyle \mathrm {H} _{\alpha }(X)={\frac {1}{1-\alpha }}\log {\Bigg (}\sum _{i=1}^{n}p_{i}^{\alpha }{\Bigg )}$.

The Rényi entropy of order $\alpha$, where $\alpha \geq 0$ and $\alpha \neq 1$, is defined as $$\mathrm {H} _{\alpha }(X)={\frac {1}{1-\alpha }}\log {\Bigg (}\sum _{i=1}^{n}p_{i}^{\alpha }{\Bigg )}$$ Here, $X$ is a discrete random variable with possible outcomes in the set ${\mathcal {A}}=\{x_{1},x_{2},\dots,x_{n}\}$ and corresponding probabilities $p_{i}\doteq \Pr(X=x_{i})$ for $i=1,\dots,n$. The logarithm is conventionally taken to be base $2$, especially in the context of information theory where bits are used.
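A minimal numerical sketch of this definition (plain NumPy; the function name and the example distribution are illustrative, not part of the tag wiki):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(X) in bits, for alpha > 0 and alpha != 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # outcomes with zero probability contribute nothing
    return np.log2(np.sum(p ** alpha)) / (1 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
for alpha in (0.5, 2, 3):
    print(alpha, renyi_entropy(p, alpha))   # the value is non-increasing in alpha
```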

11 questions
8
votes
1 answer

Is there a chain rule for Sibson's mutual information?

Mutual Information satisfies the chain rule: $$I(X,Y;Z) = I(X;Z) + I(Y;Z|X).$$ The chain rule is useful and the proof is simply linearity of expectations. Sometimes we want something stronger than mutual information. Mutual information is the…
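The classical chain rule quoted above is easy to verify numerically; the sketch below (all variable names and the random joint pmf are mine, and it does not touch Sibson's mutual information itself) evaluates both sides on an arbitrary joint distribution $p(x,y,z)$:

```python
import numpy as np

def mi_xy_z(p):
    """I(X,Y;Z) in bits for a joint pmf p[x, y, z]."""
    pxy = p.sum(axis=2)
    pz = p.sum(axis=(0, 1))
    t = p * np.log2(p / (pxy[:, :, None] * pz[None, None, :]))
    return np.nansum(t)              # nansum treats 0*log(0/...) terms as 0

def mi_x_z(p):
    """I(X;Z) in bits."""
    pxz = p.sum(axis=1)
    px, pz = pxz.sum(axis=1), pxz.sum(axis=0)
    t = pxz * np.log2(pxz / (px[:, None] * pz[None, :]))
    return np.nansum(t)

def mi_y_z_given_x(p):
    """I(Y;Z|X) in bits."""
    px = p.sum(axis=(1, 2))
    total = 0.0
    for x in range(p.shape[0]):
        pyz = p[x] / px[x]           # conditional pmf p(y, z | x)
        py, pz = pyz.sum(axis=1), pyz.sum(axis=0)
        t = pyz * np.log2(pyz / (py[:, None] * pz[None, :]))
        total += px[x] * np.nansum(t)
    return total

rng = np.random.default_rng(0)
p = rng.random((2, 3, 2))
p /= p.sum()                          # a generic strictly positive joint pmf
print(mi_xy_z(p), mi_x_z(p) + mi_y_z_given_x(p))   # the two numbers agree
```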
4
votes
1 answer

How to prove $(p\log p+q\log q)^2\leq -\log(p^2+q^2)\log 2$?

I came up with the following conjecture while tackling another problem: Conjecture. Let $p \in [0, 1]$ and $q = 1-p$. Then $$ (p \log p + q \log q)^2 \leq -\log(p^2 + q^2)\log 2 $$ A numerical experiment shows that this must be true, but I have no…
Sangchul Lee
  • 181,930
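A quick grid check of the conjectured inequality (the grid, tolerance, and natural-log choice are mine; the inequality is invariant under a change of logarithm base, since both sides pick up the same squared factor):

```python
import numpy as np

p = np.linspace(1e-9, 1 - 1e-9, 100001)    # avoid log(0) at the endpoints
q = 1 - p
lhs = (p * np.log(p) + q * np.log(q)) ** 2
rhs = -np.log(p ** 2 + q ** 2) * np.log(2)
print(np.all(lhs <= rhs + 1e-12))          # True on this grid; equality holds at p = 1/2
```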
4
votes
2 answers

Is the Rényi entropy a continuous function with respect to the parameter $\alpha$?

The Rényi entropy of order $\alpha$, where $\alpha > 0$ and $\alpha \neq 1$, is defined as $$ \mathrm{H}_\alpha(X)=\frac{1}{1-\alpha} \log \left(\sum_{i=1}^n p_i^\alpha\right) $$ Here, $X$ is a discrete random variable with possible outcomes in the…
Mark
  • 7,702
  • 6
  • 41
  • 80
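For the point excluded from the definition, a standard L'Hôpital computation suggests why continuity is only an issue at $\alpha=1$: the limit there recovers the Shannon entropy (base-$2$ logs, sum restricted to outcomes with $p_i>0$),
$$\lim_{\alpha\to1}\mathrm{H}_\alpha(X)=\lim_{\alpha\to1}\frac{\log_2\big(\sum_i p_i^{\alpha}\big)}{1-\alpha}=\lim_{\alpha\to1}\frac{\sum_i p_i^{\alpha}\ln p_i}{-\ln 2\,\sum_i p_i^{\alpha}}=-\sum_i p_i\log_2 p_i,$$
so extending $\mathrm{H}_\alpha$ by the Shannon entropy at $\alpha=1$ makes $\alpha\mapsto\mathrm{H}_\alpha(X)$ continuous there, while away from $\alpha=1$ continuity is immediate from the formula.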
3
votes
2 answers

Explicit examples of smooth entropy computation

Smooth classical entropies generalize the standard notions of entropy. The smoothing amounts to a minimization/maximization over all events $\Omega$ such that $p(\Omega)\geq 1-\varepsilon$ for a given $\varepsilon\geq 0$. The smooth max and min…
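For a classical distribution the minimization described in this excerpt can be made concrete for the max-entropy: to capture probability at least $1-\varepsilon$ with as few outcomes as possible, keep the most probable outcomes first. A hedged Python sketch (function name, tolerance, and example distribution are mine):

```python
import numpy as np

def smooth_max_entropy(p, eps):
    """H_max^eps(X): minimum of log2 |supp(Omega)| over events Omega with p(Omega) >= 1 - eps."""
    p = np.sort(np.asarray(p, dtype=float))[::-1]   # most probable outcomes first
    cum = np.cumsum(p)
    k = int(np.argmax(cum >= 1 - eps - 1e-12)) + 1  # fewest outcomes whose mass reaches 1 - eps
    return np.log2(k)

p = [0.6, 0.2, 0.1, 0.05, 0.04, 0.01]
for eps in (0.0, 0.05, 0.2):
    print(eps, smooth_max_entropy(p, eps))          # smoothing can only lower the max-entropy
```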
2
votes
0 answers

What's the rationale behind the definitions of min- and max-entropies?

Going for example with the notation used in (Renner 2006), min- and max-entropies of a source $X$ with probability distribution $P_X$ are defined as $$H_{\rm max}(X) \equiv \log|\{x : \,\, P_X(x)>0\}| = \log|\operatorname{supp}(P_X)|, \\ H_{\rm…
glS
  • 7,963
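With the (truncated) definition quoted above, and the min-entropy as quoted elsewhere on this page, $H_{\rm min}(X) = \min_x \log(1/p_x) = -\log \max_x P_X(x)$, both quantities are one-liners for a classical source (a sketch; names and example are mine):

```python
import numpy as np

def h_max(p):
    """H_max(X) = log2 |supp(P_X)|."""
    p = np.asarray(p, dtype=float)
    return np.log2(np.count_nonzero(p > 0))

def h_min(p):
    """H_min(X) = -log2 max_x P_X(x)."""
    p = np.asarray(p, dtype=float)
    return -np.log2(p.max())

p = [0.7, 0.1, 0.1, 0.1, 0.0]
print(h_max(p), h_min(p))   # 2.0 bits and about 0.515 bits; H_min <= H_max always
```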
1
vote
0 answers

Is the conditional smooth min-entropy of two random variables larger than the conditional smooth min-entropy of one?

I have tried to prove the following inequality for smooth min-entropies \begin{equation} H_{min}^{\epsilon}(XY|K) \geq H_{min}^{\epsilon}(X|K) \end{equation} I started trying to prove it for the non-smooth min-entropy. The definition of the…
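The definition in this excerpt is cut off, so the following is only a sanity check under one common classical convention, $H_{\rm min}(A|K) = -\log_2 \sum_k P_K(k)\max_a P_{A|K}(a|k)$ (the guessing-probability form): the non-smooth inequality then holds because $\max_{x,y} p(x,y|k) \le \max_x p(x|k)$. A numerical illustration (all names and the random pmf are mine):

```python
import numpy as np

def h_min_given_k(p_ak, pk):
    """Non-smooth classical conditional min-entropy of A given K, guessing-probability form."""
    p_a_given_k = p_ak / pk[None, :]
    return -np.log2(np.sum(pk * p_a_given_k.max(axis=0)))

rng = np.random.default_rng(1)
p = rng.random((3, 4, 2))            # joint pmf p[x, y, k]
p /= p.sum()
pk = p.sum(axis=(0, 1))

p_xy_k = p.reshape(-1, p.shape[2])   # treat A = (X, Y) as a single variable
p_x_k = p.sum(axis=1)                # marginalize out Y to get A = X

print(h_min_given_k(p_xy_k, pk) >= h_min_given_k(p_x_k, pk))   # True
```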
1
vote
1 answer

Functional derivative of the Rényi divergence

I would like to calculate the functional derivative w.r.t. the first argument of the Rényi divergence \begin{align} D_\alpha(q||p)=\frac{1}{\alpha-1}\log\int q^\alpha(x) p^{1-\alpha}(x)dx \end{align} Personally, I would proceed as…
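Ignoring the normalization constraint on $q$ (a Lagrange multiplier would only add a constant term), one hedged sketch of the variational computation: write $Z[q]=\int q^\alpha(x)\,p^{1-\alpha}(x)\,dx$, so that $D_\alpha(q\|p)=\frac{1}{\alpha-1}\log Z[q]$ and
$$\frac{\delta Z}{\delta q(x)}=\alpha\,q^{\alpha-1}(x)\,p^{1-\alpha}(x)\quad\Longrightarrow\quad\frac{\delta D_\alpha}{\delta q(x)}=\frac{\alpha}{\alpha-1}\,\frac{q^{\alpha-1}(x)\,p^{1-\alpha}(x)}{Z[q]}.$$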
1
vote
1 answer

Maximum value of Rényi entropy

Given a discrete random variable $X$, which takes values in the alphabet $\mathcal {X}$ and is distributed according to $p:{\mathcal {X}}\to [0,1]$, the Shannon entropy is defined as: $$\mathrm {H} (X):=-\sum _{x\in {\mathcal {X}}}p(x)\log p(x)$$ As…
Mark
  • 7,702
  • 6
  • 41
  • 80
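For context, the uniform distribution on $n$ outcomes attains the value $\log n$ for every admissible $\alpha$, matching the familiar Shannon-entropy maximum $\log|\mathcal{X}|$:
$$\mathrm{H}_\alpha(U_n)=\frac{1}{1-\alpha}\log\Big(\sum_{i=1}^{n}n^{-\alpha}\Big)=\frac{1}{1-\alpha}\log\big(n^{1-\alpha}\big)=\log n.$$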
1
vote
1 answer

Convexity of relative entropy for probability measures with no densities

I am trying to prove convexity of the relative entropy for general measures (without using densities w.r.t. the Lebesgue measure, but just Radon–Nikodym derivatives). Given two measures, $\mu$ and $\nu$, define the relative entropy in the usual way: if…
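As a discrete sanity check only (not a proof for general measures): when both measures reduce to finite pmfs, joint convexity of $(\mu,\nu)\mapsto D(\mu\|\nu)$ can be verified numerically. A sketch (names, seed, and tolerance are mine):

```python
import numpy as np

def kl(mu, nu):
    """Relative entropy D(mu || nu) for finite pmfs, natural log."""
    mu, nu = np.asarray(mu, float), np.asarray(nu, float)
    m = mu > 0
    return float(np.sum(mu[m] * np.log(mu[m] / nu[m])))

rng = np.random.default_rng(2)
mu1, mu2, nu1, nu2 = (v / v.sum() for v in rng.random((4, 5)))

for lam in np.linspace(0, 1, 11):
    lhs = kl(lam * mu1 + (1 - lam) * mu2, lam * nu1 + (1 - lam) * nu2)
    rhs = lam * kl(mu1, nu1) + (1 - lam) * kl(mu2, nu2)
    assert lhs <= rhs + 1e-12          # joint convexity holds at each mixing weight
print("joint convexity verified on this example")
```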
1
vote
0 answers

What is the relation between (smoothed) max-entropy and source compressibility?

One thing mentioned in the comments of What's the rationale behind the definitions of min- and max-entropies? is the fact that the max-entropy quantifies the number of bits needed to compress a given source with zero error, in the single-shot…
glS
  • 7,963
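A toy version of the zero-error, single-shot statement in this excerpt: enumerating the support gives a fixed-length code of $\lceil\log_2|\operatorname{supp}(P_X)|\rceil = \lceil H_{\rm max}(X)\rceil$ bits per symbol (the example distribution and names are mine):

```python
import math

p = {"a": 0.5, "b": 0.25, "c": 0.25, "d": 0.0}
support = [x for x, px in p.items() if px > 0]
n_bits = math.ceil(math.log2(len(support)))              # ceil(H_max) bits per symbol
codebook = {x: format(i, f"0{n_bits}b") for i, x in enumerate(support)}
print(n_bits, codebook)   # 2 bits; every outcome that can occur gets a distinct codeword
```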
1
vote
1 answer

Does the min-entropy $H_{\rm min}(X)\equiv \min_x\log(1/p_x)$ of a source $X$ have an operational interpretation?

This is a more specific version of this other related question of mine. Going for example with the notation used in (Renner 2006), min- and max-entropies of a source $X$ with probability distribution $P_X$ are defined as $$H_{\rm max}(X) \equiv…
glS
  • 7,963
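One common operational reading, sketched here only as a sanity check: $2^{-H_{\rm min}(X)}$ equals the best probability of guessing $X$ in a single attempt, since the optimal strategy is to guess the most likely outcome:

```python
import numpy as np

p = np.array([0.4, 0.3, 0.2, 0.1])

h_min = np.min(np.log2(1 / p))            # H_min(X) = min_x log2(1 / p_x)
p_guess = p.max()                         # success probability of the best single guess

print(np.isclose(2 ** -h_min, p_guess))   # True: 2^{-H_min} is exactly the guessing probability
```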