
Let $X$ be a continuous random variable and let $X^n$ be a quantization of $X$ that becomes finer as $n$ increases. Let $Y$ be a deterministic function of $X$. Then the conditional entropy satisfies $$H(Y|X) = 0$$ because $Y$ is determined by $X$.

Furthermore, we have $H(Y|X^n) = \infty$ for all $n$, because the distribution of $Y$ given $X^n = x^n$ is continuous.
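
To make this second claim concrete, here is a minimal numerical sketch, assuming for illustration that $Y = X$ with $X$ uniform on $[0,1]$ and that $X^n$ partitions $[0,1]$ into equal cells: conditioned on any one cell, $Y$ is still continuous, so measuring it with $m$ equal sub-bins gives entropy $\log_2 m$, which diverges as the resolution is refined.

```python
import numpy as np

# Minimal sketch, assuming Y = X with X ~ Uniform[0,1] and X^n a partition
# of [0,1] into equal cells. Conditioned on any one cell, Y is uniform on
# an interval; measured with m equal sub-bins it is uniform over m outcomes,
# so its Shannon entropy is log2(m), which diverges as m grows.
for m in [10, 100, 10_000, 1_000_000]:   # measurement resolution inside one cell
    p = np.full(m, 1.0 / m)              # Y | (X^n = cell) uniform over m sub-bins
    H = -np.sum(p * np.log2(p))
    print(f"m = {m:9d}: H(Y | X^n = cell) = {H:.2f} bits (= log2 m)")
```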

Now my question is: is it valid to say that $\lim_{n \to \infty} H(Y|X^n) = H(Y|X)$? Intuitively, I would say it is, but I have been unable to construct an $\epsilon$-$\delta$ proof of this. Any help is appreciated.

  • The differential entropy is not the entropy. Your first assertion (if $H$ is a differential entropy) is false. See e.g. http://math.stackexchange.com/a/1398471/312 or http://math.stackexchange.com/questions/454078/non-zero-conditional-differential-entropy-between-a-random-variable-and-a-functi – leonbloy Mar 16 '16 at 00:01
  • I don't consider differential entropy here. Since $Y$ is known for a given $X = x$, $H(Y|X = x) = 0$ and then also $H(Y|X) = 0$, right? –  Mar 16 '16 at 06:38
  • I don't understand how you define the entropy of a continuous variable ($Y$) if it's not the differential entropy. – leonbloy Mar 16 '16 at 10:52
  • Ok, so I agree that for continuous variables there is differential entropy (denote it with $h(Y)$). In most cases $h(Y) < \infty$ and $H(Y) = \infty$ (where $H(Y)$ is the ordinary Shannon entropy). But now, since $Y$ is a function of $X$, the distribution of $Y|X=x$ is no longer continuous but discrete, or am I totally wrong here? Then $H(Y|X) = \int_{\mathcal X} H(Y|X=x) \text{d}P_X(x) = 0$. –  Mar 16 '16 at 11:03
  • Then you must use different definitions for the two entropies. The "infinite" entropy $H(Y|X^n) = \infty$ can only make sense if defined as a limit. Then, you have two iterated limits, and your assertion is ambiguous. Take as an example $Y=X$ and $X$ uniform in $[0,a]$ – leonbloy Mar 16 '16 at 11:45
  • To make sense of your paradox, I suggest you work out this simpler one. Consider $X_n$ uniform in $[0,1/n]$, $n=1,2,\cdots$. Then $H(X_n)=+\infty$ for all $n$. Now, consider a discrete variable $X$ taking the single value $0$ (a constant), hence $H(X)=0$. It's true (... in some sense) that $X_n \to X$ as $n\to \infty$, but it's false that $H(X_n) \to H(X)$ (a numerical sketch of this example appears after these comments). – leonbloy Mar 16 '16 at 13:27
  • So $\lim_{n \to \infty} H(X_n)$ just doesn't exist? –  Mar 16 '16 at 13:41
  • It either does not exist, or it's infinite. But it's not zero. In general, if $g(n)>M$ $\forall n$, then it cannot happen that $\lim_{n \to \infty} g(n)< M$. – leonbloy Mar 16 '16 at 13:46
  • That makes sense now. Thanks a lot for your help. –  Mar 16 '16 at 13:49
  • @leonbloy: Even if $Y$ is continuous, since $Y=f(X)$, the statement $H(Y|X)=0$ is correct and meaningful. – Bernhard Apr 26 '16 at 13:19
  • @Bernhard What is $H(Y|X)$ when $Y$ is continuous? It's a differential entropy or not? How do you define it? – leonbloy Apr 26 '16 at 14:03
  • @leonbloy: $Y$ is continuous, but $Y|X=x$ is not. You define it simply as $H(Y|X)=\int_\Omega H(Y|X=x) dP_X(x)$. It is not a differential entropy, but a Shannon entropy. Note that the conditional differential entropy $h(Y|X)=-\infty$. – Bernhard Apr 27 '16 at 07:19
  • @Bernhard I'm ok with that. But, then, the (Shannon) entropy $H(Y|X=x) $ with $Y$ continuous is $+\infty$ unless the density consists of Dirac deltas (i.e. unless $Y$ is discrete - informally speaking). – leonbloy Apr 27 '16 at 11:16
  • @leonbloy: If $Y=g(X)$, then $Y$ is not continuous given that $X=x$. In other words, if $Y=g(X)$, then, informally speaking, the density of $Y$ given $X=x$ is $\delta(y-g(x))$. The RV $Y|X=x$ is discrete, with a single mass point at $y=g(x)$. – Bernhard Apr 27 '16 at 12:58
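
To illustrate leonbloy's simpler paradox and the iterated-limit ambiguity from the comments above, here is a small numerical sketch, assuming $X_n$ uniform on $[0,1/n]$ observed at bin width $\delta$: for fixed $n$ the discretized entropy diverges as $\delta \to 0$, while for fixed $\delta$ it tends to $0$ as $n \to \infty$.

```python
import numpy as np

# Sketch of the example from the comments: X_n ~ Uniform[0, 1/n], observed
# at bin width delta. X_n then occupies k = 1/(n*delta) equally likely bins
# (at least one), and a uniform distribution on k bins has entropy log2(k).
for n in [1, 10, 100]:
    for delta in [1e-2, 1e-4, 1e-6]:
        k = max(round(1 / (n * delta)), 1)   # occupied bins at this resolution
        H = np.log2(k)                       # discretized Shannon entropy of X_n
        print(f"n={n:4d}, delta={delta:g}: H = {H:6.2f} bits")
# Fixed n, delta -> 0 gives H -> infinity; fixed delta, n -> infinity gives
# H -> 0. The two iterated limits disagree, so H(X_n) -> H(X) fails.
```

This is exactly why the assertion $\lim_{n \to \infty} H(X_n) = H(X)$ fails: the "infinite" entropy only makes sense as a limit in $\delta$, and the two limits cannot be interchanged.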

1 Answer


The probability distribution of $Y|X^n=x$ converges to the probability distribution of $Y|X=x$ as $n$ increases. Nevertheless, and that is exactly the problem in this case, entropy is not continuous with respect to a converging sequence of probability measures (see, e.g., this paper).
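
As a minimal sketch of this discontinuity (an example of my own, not taken from the linked paper), consider discrete distributions $P_n$ that put mass $1-1/n$ at $0$ and spread the remaining mass $1/n$ uniformly over $2^{n^2}$ points: $P_n$ converges to the point mass at $0$, whose entropy is $0$, yet $H(P_n) = h_b(1/n) + n \to \infty$.

```python
import numpy as np

# Sketch: P_n = (1 - 1/n) * (point mass at 0) + (1/n) * Uniform over 2**(n*n)
# points. P_n converges to the point mass at 0 (entropy 0), yet
# H(P_n) = h_b(1/n) + (1/n) * log2(2**(n*n)) = h_b(1/n) + n -> infinity,
# where h_b is the binary entropy function; evaluated in closed form below.
for n in [2, 5, 10, 50]:
    eps = 1.0 / n
    h_b = -(1 - eps) * np.log2(1 - eps) - eps * np.log2(eps)  # binary entropy
    H = h_b + eps * (n * n)   # eps * log2(2**(n*n)) simplifies to n
    print(f"n={n:3d}: mass at 0 = {1 - eps:.2f}, H(P_n) = {H:.2f} bits")
```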

– Bernhard